 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

TSskylinelover
post Jul 8 2015, 08:50 PM, updated 10y ago

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(skylinelover @ Jan 7 2015, 06:36 AM)
user posted image

Nvidia (NASDAQ: NVDA; /ɛnˈvɪdiə/ en-VID-ee-ə) is an American global technology company based in Santa Clara, California. The company invented the graphics processing unit (GPU) in 1999. GPUs drive the computer graphics in games and in applications used by professional designers. Their parallel processing capabilities provide researchers and scientists with the ability to efficiently run high-performance applications, and they are deployed in supercomputing sites around the world. More recently, Nvidia has moved into the mobile computing market, where its processors power phones and tablets, as well as auto infotainment systems. Its competitors include Intel, AMD and Qualcomm.

The following are the most notable product families produced by Nvidia:
[.] GeForce - the gaming graphics processing products for which Nvidia is best known.
[.] Quadro - computer-aided design and digital content creation workstation graphics processing products.
[.] Tegra - a system on a chip series for mobile devices.
[.] Tesla - dedicated general purpose GPU for high-end image generation applications in professional and scientific fields.
[.] nForce - a motherboard chipset created by nVidia for AMD Athlon and Duron microprocessors.

FAST FACTS
> Founded in 1993
> Jen-Hsun Huang is co-founder, president and CEO
> Headquartered in Santa Clara, Calif.
> Listed with NASDAQ under the symbol NVDA in 1999
> Invented the GPU in 1999 and has shipped more than 1 billion to date
> 7,000 employees worldwide
> $4 billion in revenue in FY12
> 2,300+ patents worldwide

user posted image
NVIDIA - http://www.nvidia.com/page/home.html
GeForce Drivers - http://www.geforce.com/drivers
Blog - http://blogs.nvidia.com
*
This post has been edited by skylinelover: May 28 2016, 09:01 AM
yaphong
post Jul 8 2015, 08:51 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


Thanks for your hard work in opening new thread!
Hehe so now we can continue to talk about 980 Ti flex.gif
llk
post Jul 8 2015, 08:55 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
Thinking of watercool my gtx980ti, our ringgit so low the block +backplate nearly cost RM700
moron88
post Jul 8 2015, 09:16 PM

Getting Started
**
Junior Member
150 posts

Joined: Jun 2011


still dilemma non ref or ref 980ti.. if the difference in price is too huge, might go for ref then future cool it wit aio wc. gona hit the trigger this saturday !
PsychoHDxMachine
post Jul 8 2015, 09:23 PM

Getting Started
**
Junior Member
249 posts

Joined: Sep 2014
Less than a year 980 so fast outdated. Sad case
Hahhaaha. All talking abt 980ti. Thumb up
Minecrafter
post Jul 8 2015, 09:27 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


Yay for new thread!

skylinelover,inb4 GTXxxxxTi the best Ti ever made. whistling.gif
unequalteck
post Jul 8 2015, 09:29 PM

Custom member title
*******
Senior Member
2,690 posts

Joined: Dec 2008
From: Kota Kinabalu Current: Wangsa Maju


QUOTE(llk @ Jul 8 2015, 08:55 PM)
Thinking of watercool my gtx980ti, our ringgit so low the block +backplate nearly cost RM700
*
rm700??? include dhl shipping is it?
llk
post Jul 8 2015, 09:40 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(unequalteck @ Jul 8 2015, 09:29 PM)
rm700??? include dhl shipping is it?
*
USD175.97 X 3.85 = RM677.48


Attached thumbnail(s)
Attached Image
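(Side note: for anyone redoing that conversion at a different rate, a trivial sketch in Python; the 3.85 is just the rate used above, today's rate will differ.)

# Trivial sketch: convert a USD price to MYR at a given exchange rate.
def usd_to_myr(usd, rate=3.85):
    return round(usd * rate, 2)

print(usd_to_myr(175.97))  # -> 677.48, the figure quoted above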
SSJBen
post Jul 8 2015, 10:18 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Bad time to watercool lol.
joeboto
post Jul 8 2015, 10:49 PM

insert title
*****
Senior Member
913 posts

Joined: Jun 2006


first page?

also, its a bad time to replace GPU anyway since RM is so shitty. cry.gif

This post has been edited by joeboto: Jul 8 2015, 10:51 PM
unequalteck
post Jul 8 2015, 10:50 PM

Custom member title
*******
Senior Member
2,690 posts

Joined: Dec 2008
From: Kota Kinabalu Current: Wangsa Maju


QUOTE(llk @ Jul 8 2015, 09:40 PM)
USD175.97 X 3.85 = RM677.48
*
ohmaigod... doh.gif
SUSTheHitman47
post Jul 8 2015, 11:38 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



the topic description should be "best ti......yet" biggrin.gif
SUSHuman10
post Jul 8 2015, 11:44 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
I am actually curious if there's any AIB ever going to come out with extra huge VRAM (12GB) version of 980TI just like what they being doing previously. It shouldn't be much of a PCB design problem since they can simply mimic that of TitanX.
SUScrash123
post Jul 9 2015, 06:21 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(moron88 @ Jul 8 2015, 09:16 PM)
still dilemma non ref or ref 980ti.. if the difference in price is too huge, might go for ref then future cool it wit aio wc. gona hit the trigger this saturday !
*
i buy zotac gtx 980ti amp for 2899..someone offer me palit gtx 980 ti super jetstream for 2850 and got free batman game..so i think go non ref since it is bad day to build wc coz our ringgit so low

This post has been edited by crash123: Jul 9 2015, 06:22 AM
shikimori
post Jul 9 2015, 06:53 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(moron88 @ Jul 8 2015, 09:16 PM)
still dilemma non ref or ref 980ti.. if the difference in price is too huge, might go for ref then future cool it wit aio wc. gona hit the trigger this saturday !
*
why not go for lower card 970 or 980 its not that bad if gaming for 1080p

980ti kinda overkill lol at least for now . Unless you are talking about 1440p (just nice) or 4k gaming (need to lower some settings)
finecut
post Jul 9 2015, 08:51 AM

Casual
***
Junior Member
497 posts

Joined: Jan 2008
From: Seventh Heaven

QUOTE(shikimori @ Jul 9 2015, 06:53 AM)
why not go for lower card 970 or 980 its not that bad if gaming for 1080p

980ti kinda overkill lol at least for now . Unless you are talking about 1440p (just nice) or 4k gaming (need to lower some settings)
*
1440 down sampling smile.gif
yaphong
post Jul 9 2015, 10:01 AM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(shikimori @ Jul 9 2015, 06:53 AM)
why not go for lower card 970 or 980 its not that bad if gaming for 1080p

980ti kinda overkill lol at least for now . Unless you are talking about 1440p (just nice) or 4k gaming (need to lower some settings)
*
I am on 1440p, but 980 Ti only shine if you play a lot of latest games in highest settings...
SSJBen
post Jul 9 2015, 03:44 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(moron88 @ Jul 8 2015, 09:16 PM)
still dilemma non ref or ref 980ti.. if the difference in price is too huge, might go for ref then future cool it wit aio wc. gona hit the trigger this saturday !
*
Frankly speaking, 980Ti makes very little sense at 1080p/60fps. Unless you're using a 120/144hz monitor and want to run the latest games at those frames locked, the 980Ti is not a good card at 1080p. You wouldn't appreciate the difference in performance between a 970 and 980Ti, not at 60fps.

You may downsample games, but understand that even until today, not all games works properly with downsampling. It's still quite a 60/40 thing.

But hey, it's your money though at the end of the day.


QUOTE(crash123 @ Jul 9 2015, 06:21 AM)
i buy zotac gtx 980ti amp for 2899..someone offer me palit gtx 980 ti super jetstream for 2850 and got free batman game..so i think go non ref since it is bad day to build wc coz our ringgit so low
*
Well, both the Zotac GTX980 Ti Amp and Palit 980Ti Super Jetstream is using a 99% identical PCB and components to that of the Reference 980Ti.

People need to understand that just because there is a different cooler on an SKU, it does not always mean that the PCB is third-party.

This post has been edited by SSJBen: Jul 9 2015, 03:47 PM
goldfries
post Jul 9 2015, 03:47 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




QUOTE(yaphong @ Jul 9 2015, 10:01 AM)
I am on 1440p, but 980 Ti only shine if you play a lot of latest games in highest settings...
*
980 Ti on Full HD - well it's gonna be blazing fast, you probably won't feel the difference between 980 Ti to 970, depending on game.

980 Ti on 4K - you MUST SLI it. A single 980 Ti may still find it difficult to cope with high resolution and high detail BUT because it has 6GB RAM it's awesome when paired with extra horsepower.

yaphong
post Jul 9 2015, 04:14 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(goldfries @ Jul 9 2015, 03:47 PM)
980 Ti on Full HD - well it's gonna be blazing fast, you probably won't feel the difference between 980 Ti to 970, depending on game.

*
Yeah especially if the games are run at V-sync. I still find for performance over watts ratio, 980 is better than 980 Ti / 970...
SUScrash123
post Jul 9 2015, 04:38 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(SSJBen @ Jul 9 2015, 03:44 PM)
Frankly speaking, 980Ti makes very little sense at 1080p/60fps. Unless you're using a 120/144hz monitor and want to run the latest games at those frames locked, the 980Ti is not a good card at 1080p. You wouldn't appreciate the difference in performance between a 970 and 980Ti, not at 60fps.

You may downsample games, but understand that even until today, not all games works properly with downsampling. It's still quite a 60/40 thing.

But hey, it's your money though at the end of the day.
Well, both the Zotac GTX980 Ti Amp and Palit 980Ti Super Jetstream is using a 99% identical PCB and components to that of the Reference 980Ti.

People need to understand that just because there is a different cooler on an SKU, it does not always mean that the PCB is third-party.
*
ooo..i think u want non reference cooler..my bad..take asus strix or msi g1 gaming..i think both are custom pcb..price around 3.3-3.4k drool.gif
cstkl1
post Jul 9 2015, 05:12 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(crash123 @ Jul 9 2015, 04:38 PM)
ooo..i think u want non reference cooler..my bad..take asus strix or msi g1 gaming..i think both are custom pcb..price around 3.3-3.4k  drool.gif
*
Strix sux unless u want to wc. Normally voltage unlocked.
MSI G1 Gaming.. looks good atm.

Zotac Amp Extreme temps are insane. U guys can check it out on the 11th.
goldfries
post Jul 9 2015, 05:21 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




Insane in good or bad way?
KajumaO2
post Jul 9 2015, 06:20 PM

On my way
****
Senior Member
526 posts

Joined: May 2012
QUOTE(goldfries @ Jul 9 2015, 05:21 PM)
Insane in good or bad way?
*
My Amp Extreme is getting 56C under full load........... in air-conditioned room. With out AC 60 Tops
SSJBen
post Jul 9 2015, 09:01 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(crash123 @ Jul 9 2015, 04:38 PM)
ooo..i think u want non reference cooler..my bad..take asus strix or msi g1 gaming..i think both are custom pcb..price around 3.3-3.4k  drool.gif
*
Err I think you mistaken me for someone else?


QUOTE(cstkl1 @ Jul 9 2015, 05:12 PM)
Strix sux unless u want to wc. Normally voltage unlocked.
MSI G1 Gaming.. looks good atm.

Zotac Amp Extreme temps are insane. U guys can check it out on the 11th.
*
Strix, MSI Gaming, G1 gaming, all voltage locked. Cannot bump above 1.25v, even with bios modding.



QUOTE(goldfries @ Jul 9 2015, 05:21 PM)
Insane in good or bad way?
*
In a good way I believe. It is however a little noisy, still tolerable of course.
goldfries
post Jul 9 2015, 09:19 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




QUOTE(KajumaO2 @ Jul 9 2015, 06:20 PM)
My Amp Extreme is getting 56C under full load........... in air-conditioned room. With out AC 60 Tops
*
What's the noise level / fan RPM at that reading?
cstkl1
post Jul 9 2015, 09:36 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(SSJBen @ Jul 9 2015, 09:01 PM)
Err I think you mistaken me for someone else?
Strix, MSI Gaming, G1 gaming, all voltage locked. Cannot bump above 1.25v, even with bios modding.
In a good way I believe. It is however a little noisy, still tolerable of course.
*
Very low noise. Almost cannot hear if auto

Strix voltage if its like 980 is unlock.


SUSHuman10
post Jul 10 2015, 12:07 AM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(crash123 @ Jul 9 2015, 04:38 PM)
ooo..i think u want non reference cooler..my bad..take asus strix or msi g1 gaming..i think both are custom pcb..price around 3.3-3.4k  drool.gif
*
Techpowerup questioned MSI's choice of going with dual 8-pin connectors when they didn't really raise the card's power limit.
http://www.techpowerup.com/reviews/MSI/GTX..._Gaming/28.html

This post has been edited by Human10: Jul 10 2015, 12:07 AM
SUSHuman10
post Jul 10 2015, 12:22 AM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(KajumaO2 @ Jul 9 2015, 06:20 PM)
My Amp Extreme is getting 56C under full load........... in air-conditioned room. With out AC 60 Tops
*
Wow, that's indeed insane.

Seems like Zotac's humungous 2.5 slots cooler pay off at last. laugh.gif
cstkl1
post Jul 10 2015, 12:34 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(KajumaO2 @ Jul 9 2015, 06:20 PM)
My Amp Extreme is getting 56C under full load........... in air-conditioned room. With out AC 60 Tops
*
yup. insane temps and very silent on auto. Seriously so close to WC.

The best part this card doesnt have the bug that reference card throttle has. reference card throttles around 13-26mhz at 65. This card throttles based on PL/Temps. And till date havent seen it hitting the pl yet.


reconvision
post Jul 10 2015, 12:38 AM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
Just got my Asus Strix 980ti today. I can say the max temp on the default fan setting, running at 1404MHz boost clock (gaming mode), is 75-76 Celsius at 41% fan speed, and it is very quiet. Next, I adjusted my own fan curve to 50-52% fan speed, which has slightly audible fan noise but is very acceptable. The final temp is 68-69 Celsius. DON'T even bother increasing the fan speed to 60% because it's like a blower, lol! Extremely annoying! Oh, and lastly, very very very minimal coil whine. Almost none. Sorry guys! Not gonna touch overclocking yet.
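(Side note: a custom fan curve like that is just a temperature-to-duty mapping. A tiny illustrative sketch in Python; the points are hypothetical and real control is done in MSI Afterburner or the vendor tool, not in code.)

# Illustrative only: map GPU temperature (C) to fan duty (%) by linear
# interpolation between curve points, the same idea as an Afterburner curve.
CURVE = [(40, 30), (60, 41), (70, 52), (80, 65)]  # hypothetical (temp C, fan %) points

def fan_percent(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_percent(68))  # ~49.8, roughly the 50% band mentioned above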
r1nk
post Jul 10 2015, 05:50 AM

Casual
***
Junior Member
456 posts

Joined: Feb 2009



guys my pc showing "Display driver stopped responding and has recovered" like every 2 days like that.. im using gtx970.. is there any solution?
its making me annoyed..
reconvision
post Jul 10 2015, 07:24 AM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(r1nk @ Jul 10 2015, 05:50 AM)
guys my pc showing "Display driver stopped responding and has recovered" like every 2 days like that.. im using gtx970.. is there any solution?
its making me annoyed..
*
Try using 353.38 driver version. It fixes the problem for me.
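(Side note: if a driver swap alone doesn't cure that error, another workaround that gets suggested a lot is raising Windows' GPU timeout threshold. A minimal sketch, assuming Windows, Python run as Administrator, and a reboot afterwards; TdrDelay is the documented registry value for that timeout.)

# Raise the TDR (Timeout Detection and Recovery) delay from the default 2s to 8s,
# a common workaround for "Display driver stopped responding and has recovered".
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 8)

print("TdrDelay set to 8 seconds; reboot for it to take effect.")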

queenc
post Jul 10 2015, 07:37 AM

~GGMU~
******
Senior Member
1,853 posts

Joined: Feb 2010



you guys normally set auto update driver using geforce experience or manually update it?
reconvision
post Jul 10 2015, 08:01 AM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(queenc @ Jul 10 2015, 07:37 AM)
you guys normally set auto update driver using geforce experience or manually update it?
*
I usually update manually as to avoid any bad driver they release.
queenc
post Jul 10 2015, 08:05 AM

~GGMU~
******
Senior Member
1,853 posts

Joined: Feb 2010



QUOTE(reconvision @ Jul 10 2015, 08:01 AM)
I usually update manually as to avoid any bad driver they release.
*
isee.. my gtx770 facing driver issue/error when playing motogp15 or gta5 after latest drive update via geforce exp/. sweat.gif
reconvision
post Jul 10 2015, 08:11 AM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(queenc @ Jul 10 2015, 08:05 AM)
isee.. my gtx770 facing driver issue/error when playing motogp15 or gta5 after latest drive update via geforce exp/. sweat.gif
*
I dun hv gta 5 or motogp15 so I hv no idea which driver is suitable for those game. What I heard is 353.38 is less problematic. May be u can try Tat out.
KajumaO2
post Jul 10 2015, 09:53 AM

On my way
****
Senior Member
526 posts

Joined: May 2012
QUOTE(goldfries @ Jul 9 2015, 09:19 PM)
What's the noise level / fan RPM at that reading?
*
As for noise my H100I is louder in a push pull config than the card, as for the rpm i didnt check its default config on afterburner.
TSskylinelover
post Jul 10 2015, 09:58 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(queenc @ Jul 10 2015, 08:05 AM)
isee.. my gtx770 facing driver issue/error when playing motogp15 or gta5 after latest drive update via geforce exp/. sweat.gif
*
Download hotfix hahahaha
queenc
post Jul 10 2015, 10:34 AM

~GGMU~
******
Senior Member
1,853 posts

Joined: Feb 2010



QUOTE(skylinelover @ Jul 10 2015, 09:58 AM)
Download hotfix hahahaha
*
what is that
TSskylinelover
post Jul 10 2015, 12:10 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(queenc @ Jul 10 2015, 10:34 AM)
what is that
*
better than the WHQL driver laugh.gif rclxms.gif

http://forums.guru3d.com/showthread.php?t=400347
shikimori
post Jul 10 2015, 07:18 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(skylinelover @ Jul 10 2015, 12:10 PM)
why not this instead

http://forums.guru3d.com/showthread.php?t=400690
cstkl1
post Jul 10 2015, 08:00 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

Heard tomorrow some brands got voucher rebates.

600ppl. Hope to make it. Lost in the crowd.
yaphong
post Jul 10 2015, 11:49 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(reconvision @ Jul 10 2015, 12:38 AM)
Just got my Asus strix 980ti today. I can said the max temp on default fan setting running at 1404MHZ boost clock (gamingmode) is 75-76 celsuis in 41 % fan speed.it is very quiet. Next, i just adjust my own fan curve to 50-52% fan speed  which hv slighty audible fan noise but is very acceptable. The final temp is 68-69 celsius. DONT EVEN bother to increase the fan speed to 60% becoz is like blower Lol! Extremely annoying!  Oh, at last very very very minimal coil whine. Almost none. Sorry guys! Not gonna touch overclocking yet.
*
Where you buy and how much you got it for?
reconvision
post Jul 11 2015, 12:39 AM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(yaphong @ Jul 10 2015, 11:49 PM)
Where you buy and how much you got it for?
*
check your pm.
Moogle Stiltzkin
post Jul 11 2015, 11:57 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(queenc @ Jul 10 2015, 07:37 AM)
you guys normally set auto update driver using geforce experience or manually update it?
*
i use DDU from guru3d to do a clean install everytime there is an update.
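(Side note: a quick way to sanity-check which driver you actually ended up on after a DDU clean install. A minimal sketch, assuming the NVIDIA driver's nvidia-smi tool is on the PATH.)

# Print the installed GeForce driver version as reported by nvidia-smi.
import subprocess

version = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    text=True,
).strip()
print("Installed NVIDIA driver:", version)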


QUOTE(shikimori @ Jul 9 2015, 06:53 AM)
why not go for lower card 970 or 980 its not that bad if gaming for 1080p

980ti kinda overkill lol at least for now . Unless you are talking about 1440p (just nice) or 4k gaming (need to lower some settings)
nah i don't agree 980ti is overkill for a 1080p.


review of 980ti
http://www.trustedreviews.com/nvidia-gefor...-results-page-2


most of the review benchmarks show that for a 1440p at ultra setting can most part reach the sweet spot of 60fps+ on average.

but if you're planning on using a 144hz gsync monitor, you can go above the 60fps cap you'd otherwise have with vsync, by using gsync. So you can take advantage of the higher fps.

on older monitors without gsync, you'd only really care that the gpu averages 60fps (because you're locked there if you're using triple buffer vsync).

So i'd say 980ti even on 1080p would be great with gsync ideally the acer predator gsync 144hz ips monitor :}

1440p for 27'' monitor is these days quite capable of being driven by the gpu especially by the 980 ti.

4k seems you need to lower your graphics settings. and especially lower AA which is less needed at this resolution.

the hardocp review pointed out what kind of settings tweaks was needed when trying to get a playable setting at the highest quality possible
http://www.hardocp.com/article/2015/06/15/...rd_gpu_review/7




Anyway there is actually a topic talking about this issue 1080p @60hz (vsync) monitor vs @144hz (gsync)
https://www.reddit.com/r/buildapc/comments/...last_for_1080p/


PS: another reason i think 980ti is needed even for 1080p, back when i bought the 680gtx, it barely lasted 2-3 years before i was not able to ultra setting some of the games by then cry.gif so i rather reduce the chances of that by future proofing when possible whenever i get a gpu upgrade every 5 years or so.


This post has been edited by Moogle Stiltzkin: Jul 11 2015, 12:27 PM
marfccy
post Jul 11 2015, 02:19 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Moogle Stiltzkin @ Jul 11 2015, 11:57 AM)
i use DDU from guru3d to do a clean install everytime there is an update.
nah i don't agree 980ti is overkill for a 1080p.
review of 980ti
http://www.trustedreviews.com/nvidia-gefor...-results-page-2
most of the review benchmarks show that for a 1440p at ultra setting can most part reach the sweet spot of 60fps+ on average.

but if your planning using a 144hz gsync monitor, you can go above the capped 60fps when using vsync, by using the gsync. So you can take advantage of the higher fps.

only old monitors without gsync, you'd be interested only that the gpu performs on average 60fps ideally (because your locked there if your using triple buffer vsync).

So i'd say 980ti even on 1080p would be great with gsync ideally the acer predator gsync 144hz ips monitor :}

1440p for 27'' monitor is these days quite capable of being driven by the gpu especially by the 980 ti.

4k seems you need to lower your graphics settings. and especially lower AA which is less needed at this resolution.

the hardocp review pointed out what kind of settings tweaks was needed when trying to get a playable setting at the highest quality possible
http://www.hardocp.com/article/2015/06/15/...rd_gpu_review/7
Anyway there is actually a topic talking about this issue  1080p @60hz (vsync) monitor vs @144hz (gsync)
https://www.reddit.com/r/buildapc/comments/...last_for_1080p/
PS: another reason i think 980ti is needed even for 1080p, back when i bought the 680gtx, it barely lasted 2-3 years before i was not able to ultra setting some of the games by then cry.gif  so i rather reduce the chances of that by future proofing when possible whenever i get a gpu upgrade every 5 years or so.
*
hey its technology, remember how fast smartphones get obsolete?

well just have to deal with it

sides, 2-3 years is pretty darn long for a graphics card
shikimori
post Jul 11 2015, 04:08 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(Moogle Stiltzkin @ Jul 11 2015, 11:57 AM)
i use DDU from guru3d to do a clean install everytime there is an update.
nah i don't agree 980ti is overkill for a 1080p.
review of 980ti
http://www.trustedreviews.com/nvidia-gefor...-results-page-2
most of the review benchmarks show that for a 1440p at ultra setting can most part reach the sweet spot of 60fps+ on average.

but if your planning using a 144hz gsync monitor, you can go above the capped 60fps when using vsync, by using the gsync. So you can take advantage of the higher fps.

only old monitors without gsync, you'd be interested only that the gpu performs on average 60fps ideally (because your locked there if your using triple buffer vsync).

So i'd say 980ti even on 1080p would be great with gsync ideally the acer predator gsync 144hz ips monitor :}

1440p for 27'' monitor is these days quite capable of being driven by the gpu especially by the 980 ti.

4k seems you need to lower your graphics settings. and especially lower AA which is less needed at this resolution.

the hardocp review pointed out what kind of settings tweaks was needed when trying to get a playable setting at the highest quality possible
http://www.hardocp.com/article/2015/06/15/...rd_gpu_review/7
Anyway there is actually a topic talking about this issue  1080p @60hz (vsync) monitor vs @144hz (gsync)
https://www.reddit.com/r/buildapc/comments/...last_for_1080p/
PS: another reason i think 980ti is needed even for 1080p, back when i bought the 680gtx, it barely lasted 2-3 years before i was not able to ultra setting some of the games by then  cry.gif  so i rather reduce the chances of that by future proofing when possible whenever i get a gpu upgrade every 5 years or so.
*
well if you are talking about 1080p at 144hz monitor I suppose its fine but who wants to be stuck at that resolution ?
once you got a taste of WQHD or UHD its hard going back

Also, I have to disagree on future proofing graphics cards. With the likes of the NVIDIA 780 being outperformed or matched by the 970, the gap to each new generation will only get bigger. Not to mention the introduction of HBM RAM, I think I'd probably have to change cards every year sad.gif

Funny thing is for processor I'm still stuck with sandybridge cant see any reason to upgrade apart from power efficiency , ddr4

goldfries
post Jul 11 2015, 04:35 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




980Ti is certainly not overkill for 1080p.

It's just perfect for it.
marfccy
post Jul 11 2015, 04:56 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(shikimori @ Jul 11 2015, 04:08 PM)
well if you are talking about 1080p at 144hz monitor I suppose its fine but who wants to be stuck at that resolution ?
once you got a taste of  WQHD or UHD its hard going back

Also , I have to disagree on future proofing graphic card .  With the likes of NVIDIA 780 being outperform or same by 970 , the gap for new card would be higher .  Not to mention the introduction of HBM ram I think I probably have to change cards every year  sad.gif

Funny thing is for processor I'm still stuck with sandybridge cant see any reason to upgrade apart from  power efficiency  , ddr4
*
DDR4 isnt much different than DDR3 beside the obvious higher clock speed

same goes for power efficiency. Devil's Canyon still has 95W TDP but just lower idles, nothing much. Skylake reported to come with 65W, but well have to wait for full Broadwell 14nm first before speculating on Skylake

theres somewhat not much reason to upgrade, CPU side wise

im on Sandy as well, and im on the fence on upgrading at all

Ideal has put Intel i7 5775C on their pricelist btw, whooping ~RM1.4k

do want, but dem its expensive
SUSHuman10
post Jul 11 2015, 05:21 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(marfccy @ Jul 11 2015, 04:56 PM)
DDR4 isnt much different than DDR3 beside the obvious higher clock speed

same goes for power efficiency. Devil's Canyon still has 95W TDP but just lower idles, nothing much. Skylake reported to come with 65W, but well have to wait for full Broadwell 14nm first before speculating on Skylake

theres somewhat not much reason to upgrade, CPU side wise

im on Sandy as well, and im on the fence on upgrading at all

Ideal has put Intel i7 5775C on their pricelist btw, whooping ~RM1.4k

do want, but dem its expensive
*
I find it ridiculous. The price almost on par with 5820k already...

It will be one of the most rare CPU in the market in future. Skylake is around the corner, it will only be useful for those who readily had H97/Z97 mobo. But again high chances are those people already had Haswell (refresh) CPU and won't consider upgrading anytime soon. Now with this insane price, not much new end consumers will opt for them. There's also news that OEMs will skip Broadwell and directly to Skylake.

So in short, it will be one rare "pokemon".
marfccy
post Jul 11 2015, 06:18 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Human10 @ Jul 11 2015, 05:21 PM)
I find it ridiculous. The price almost on par with 5820k already...

It will be one of the most rare CPU in the market in future. Skylake is around the corner, it will only be useful for those who readily had H97/Z97 mobo. But again high chances are those people already had Haswell (refresh) CPU and won't consider upgrading anytime soon. Now with this insane price, not much new end consumers will opt for them. There's also news that OEMs will skip Broadwell and directly to Skylake.

So in short, it will be one rare "pokemon".
*
yeah, its one of those odd releases that doesnt seem to have anywhere to fit in

its suffering from the "<Insert nex gen> is near, wait it out longer" syndrome doh.gif

on the bright side of going consumer than Extreme series, you save alot from not getting LGA2011 board

Skylake at least had the decency to support DDR3L instead of pure DDR4
Moogle Stiltzkin
post Jul 11 2015, 06:31 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(shikimori @ Jul 11 2015, 04:08 PM)
well if you are talking about 1080p at 144hz monitor I suppose its fine but who wants to be stuck at that resolution ?
once you got a taste of  WQHD or UHD its hard going back

Also , I have to disagree on future proofing graphic card .  With the likes of NVIDIA 780 being outperform or same by 970 , the gap for new card would be higher .  Not to mention the introduction of HBM ram I think I probably have to change cards every year  sad.gif

Funny thing is for processor I'm still stuck with sandybridge cant see any reason to upgrade apart from  power efficiency  , ddr4
*
too many people to quote lel.

970 might be fine for 1080p for less than half the price which is substantial for most people, especially regular gamers (not truly enthusiast high end level pc gear).

but i think the 970s were 3.5gb vram hmm.gif

some gamers at ultra use almost 6GB vram in games like mordor. 1440p recommended is minimum 4gb, with 6gb being ideal. 4k needs minimum 6gb. 2gb vram is hardly enough these days, i know for a fact because i have a 680gtx cry.gif i notice every now and then there was an error where it said it ran out of vram (no joke). Hasn't happened lately, maybe because of newer gpu drivers hmm.gif ?


anyway here is the direct comparison 980ti vs 970
http://gpuboss.com/gpus/GeForce-GTX-980-Ti...GeForce-GTX-970



here is a dragon age inquisition benchmark for 1080 and 1440p resolution using different settings from low, medium, high and ultra.
http://www.guru3d.com/articles_pages/drago...k_review,7.html


notice ultra for 970 is 50fps


for 1440p it was only 33fps for ultra
http://www.guru3d.com/articles-pages/drago...k-review,8.html





bottomline


970

pro
---------
- less than half the price of a 980ti
- sufficient for 1080p even at ultra for a heavy title like DAI



cons
-------

3.5GB vram instead of 4GB vram. How this may affect gaming
http://www.extremetech.com/extreme/198223-...emory-problem/2

ultra settings on intensive games like DAI will only reach between 40-50fps. whereas a 980ti would hit 60fps and probably higher.

if you invested in a gsync monitor with 144hz, you'd definitely want a higher end card like a 980ti to achieve much better fps thx to gsync technology :}
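(Side note: for anyone who wants to see how close a game actually gets to the 3.5GB / 4GB / 6GB lines being argued about here, a minimal sketch that polls VRAM usage while you play; it assumes nvidia-smi is on the PATH and only reads the first GPU.)

# Poll VRAM usage via nvidia-smi every few seconds while a game runs.
import subprocess, time

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

for _ in range(120):  # ~10 minutes at 5-second intervals
    out = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    used, total = out.split(", ")
    print(f"VRAM: {used} MiB / {total} MiB")
    time.sleep(5)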




shikimori
post Jul 11 2015, 06:36 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(Moogle Stiltzkin @ Jul 11 2015, 06:31 PM)
too many people to quote lel.

970 might be fine for 1080p for less than half the price which is substantial for most people, especially regular gamers (not truly enthusiast high end level pc gear).

but i think the 970s were 3.5gb vram  hmm.gif

some gamers at ultra use almost 6vram like mordor. 1440p recommended is minimum 4gb, with 6gb being ideal. 4k need minimum 6gb. 2gb vram is hardly enough these days i know for a fact because i have a 680gtx  cry.gif  i notice every now and then there was an error where it said ran out of vram (no joke). Hasn't happened lately, maybe because of newer gpu drivers  hmm.gif  ?
anyway here is the direct comparison 980ti vs 970
http://gpuboss.com/gpus/GeForce-GTX-980-Ti...GeForce-GTX-970
here is a dragon age inquisition benchmark for 1080 and 1440p resolution using different settings from low, medium, high and ultra.
http://www.guru3d.com/articles_pages/drago...k_review,7.html
notice ultra for 970 is 50fps
for 1440p it was only 33fps for ultra
http://www.guru3d.com/articles-pages/drago...k-review,8.html
bottomline
970

pro
---------
- more than half the price of a 980ti
- sufficient for 1080p even at ultra for a heavy title like DAI
cons
-------

3.5vram instead of 4vram. How this may affect gaming
http://www.extremetech.com/extreme/198223-...emory-problem/2

ultra settings on intensive games like DAI will only reach between 40-50fps. whereas a 980ti would hit 60fps and probably higher.

if you invested in a gsync monitor with 144hz, you'd definitely want a higher end card like a 980ti to achieve much better fps thx to gsync technology :}
*
sad.gif gsync ips 1440p monitor finally in malaysia

27" XB270HU-IPS RM2599 at Idealtech .... Sigh, if only IPMART would refund my money as soon as possible would grab this monitor without any hesitation

SSJBen
post Jul 11 2015, 06:46 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 11 2015, 06:31 PM)
too many people to quote lel.

970 might be fine for 1080p for less than half the price which is substantial for most people, especially regular gamers (not truly enthusiast high end level pc gear).

but i think the 970s were 3.5gb vram  hmm.gif

some gamers at ultra use almost 6vram like mordor. 1440p recommended is minimum 4gb, with 6gb being ideal. 4k need minimum 6gb. 2gb vram is hardly enough these days i know for a fact because i have a 680gtx  cry.gif  i notice every now and then there was an error where it said ran out of vram (no joke). Hasn't happened lately, maybe because of newer gpu drivers  hmm.gif  ?
anyway here is the direct comparison 980ti vs 970
http://gpuboss.com/gpus/GeForce-GTX-980-Ti...GeForce-GTX-970
here is a dragon age inquisition benchmark for 1080 and 1440p resolution using different settings from low, medium, high and ultra.
http://www.guru3d.com/articles_pages/drago...k_review,7.html
notice ultra for 970 is 50fps
for 1440p it was only 33fps for ultra
http://www.guru3d.com/articles-pages/drago...k-review,8.html
bottomline
970

pro
---------
- more than half the price of a 980ti
- sufficient for 1080p even at ultra for a heavy title like DAI
cons
-------

3.5vram instead of 4vram. How this may affect gaming
http://www.extremetech.com/extreme/198223-...emory-problem/2

ultra settings on intensive games like DAI will only reach between 40-50fps. whereas a 980ti would hit 60fps and probably higher.

if you invested in a gsync monitor with 144hz, you'd definitely want a higher end card like a 980ti to achieve much better fps thx to gsync technology :}
*
Still on 2x970s here, waiting for my 980Ti to arrive.

But I'm going to be frank with everyone here, the 3.5GB vram thing? It's not an issue. Honestly, it's been blown out of proportion. Stop looking at just numbers and charts, they cannot tell the whole story.

So far, only TWO games have had issues with the 970. That's Shadow of Mordor and GTAV, and only if you max the texture resolution out. Seriously, just bump it down 1 notch. The difference in reality is hard to notice and guess what? No more "3.5gb vram limitation arghhhh!!".


I'm not going to disagree that 980Ti isn't a great card for 1080p, but it makes very little sense to buy a 980Ti to play at 1080p, 144hz (with or without Gsync). Paying a little more for a 1440p panel makes quite a significant difference, have some extra cash laying around and getting a 4k Gysnc monitor is seriously a whole new experience.
Moogle Stiltzkin
post Jul 11 2015, 06:51 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(shikimori @ Jul 11 2015, 06:36 PM)
sad.gif gsync ips 1440p monitor finally in malaysia

27" XB270HU-IPS RM2599 at Idealtech .... Sigh, if only IPMART would refund my money as soon as possible would grab this monitor without any hesitation
*
rm2599 is not bad for a ips 27'' with gsync.... and if thats not enough @ 144hz !!! oh and with stellar reviews especially their low latency, low input lag, and overdrive done correctly without issues (normal settings)
http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm






i paid like almost rm 3k for my u2413 ah-ips gb-led 24'' lcd 60hz (no gsync)


sure the acer is a white led backlight, but honestly, i'd trade the gb-r led backlight for the acer monitor because of the gsync and the other stellar performance for gamers. my minimum threshold is has to be ips, because i can't stand tn panels.

a asus rog tn panel can achieve 1ms response time. The acer ips was tested with 5.5ms gtg response time. This is a stellar result for an ips panel.

This post has been edited by Moogle Stiltzkin: Jul 11 2015, 06:53 PM
SSJBen
post Jul 11 2015, 07:52 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Except, the QC for the Acer XB270HU has not been stellar. Take a look over the OCN owner's club, the Reddit owner's club, and the Neogaf owner's club. The rate of RMAs has been quite high, even if it's not uncommon.

And remember, Acer Malaysia does not actually have the XB27 as an official product here (or have they now, correct me if I'm wrong). If any issues, RMA will cost a lot of time and money.

This post has been edited by SSJBen: Jul 11 2015, 07:55 PM
shikimori
post Jul 11 2015, 07:58 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(SSJBen @ Jul 11 2015, 07:52 PM)
Except, the QC for the Acer XB270HU has not been stellar. Take a look over the OCN owner's club, the Reddit owner's club, and the Neogaf owner's club. The rate of RMAs has been quite high, even if it's not uncommon.

And remember, Acer Malaysia does not actually have the the XB27 as an official product here (or have they now, correct me if I'm wrong). If any issues, RMA will cost a lot of time and money.
*
same goes for rog swift lol . But those two are the best choice at the moment cry.gif
KajumaO2
post Jul 11 2015, 07:59 PM

On my way
****
Senior Member
526 posts

Joined: May 2012
QUOTE(r1nk @ Jul 10 2015, 05:50 AM)
guys my pc showing "Display driver stopped responding and has recovered" like every 2 days like that.. im using gtx970.. is there any solution?
its making me annoyed..
*
Same here so dont worry wait for the next update
SSJBen
post Jul 11 2015, 08:12 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(shikimori @ Jul 11 2015, 07:58 PM)
same goes for rog swift lol . But those two are the best choice at the moment  cry.gif
*
Yeah, damn sad.

LG should come up with theirs.
Dell should to.
BenQ... jeez man, with their focus on gaming monitors, it's like dead obvious to expect them to have 2 or 3 models by now. doh.gif
Wouldn't mind Samsung either, but knowing them, they're not ones to pay for royalty hardware (so Freesync is more likely).

Heck even budget-friendly AOC should have a model by now.

I'm disappointed that Gsync hasn't actually kicked off the way it should.
TSskylinelover
post Jul 11 2015, 08:15 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(shikimori @ Jul 11 2015, 07:58 PM)
same goes for rog swift lol . But those two are the best choice at the moment  cry.gif
*
I be keeping waiting 4 new models laugh.gif
goldfries
post Jul 11 2015, 09:23 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




QUOTE(SSJBen @ Jul 11 2015, 06:46 PM)
But I'm going to be frank with everyone here, the 3.5GB vram thing? It's not an issue. Honestly, it's been blown out of propotion. Stop looking at just numbers and charts, they cannot tell the whole story.


this is correct.

the performance is as it is, when people buy the card, it's bought for the performance.

Having 3.5GB doesn't make it lose performance, just some people aren't happy to feel "cheated".

Technically Nvidia isn't wrong because it is 3.5GB + 0.5GB so total is still 4GB.

QUOTE(SSJBen @ Jul 11 2015, 06:46 PM)
I'm not going to disagree that 980Ti isn't a great card for 1080p, but it makes very little sense to buy a 980Ti to play at 1080p, 144hz (with or without Gsync). Paying a little more for a 1440p panel makes quite a significant difference, have some extra cash laying around and getting a 4k Gysnc monitor is seriously a whole new experience.
Why not? Practically every game will run above 60 fps on GTX 980 Ti even with settings set to the max.

If the person uses 1080p all the way then it'll last for ages.

Once gone to 4K, slap on another GTX 980 Ti. That'll be sweet.
llk
post Jul 11 2015, 09:32 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(goldfries @ Jul 11 2015, 09:23 PM)
this is correct.

the performance is as it is, when people buy the card, it's bought for the performance.

Having 3.5GB doesn't make it lose performance, just some people aren't happy to feel "cheated".

Technically Nvidia isn't wrong because it is 3.5GB + 0.5GB so total is still 4GB.
Why not? Practically every game will run above 60 fps on GTX 980 Ti even with settings set to the max.

If the person uses 1080p all the way then it'll last for ages.

Once gone to 4K, slap on another GTX 980 Ti. That'll be sweet.
*
Yup. i'm still using the poor Dell 2312H monitor sweat.gif
mdzaboy
post Jul 11 2015, 10:23 PM

CuChee MunK KuK
*******
Senior Member
2,061 posts

Joined: Jan 2003
From: Jabaronie to Astaka Status: のあ..?



just won a lucky draw 980TI HOF today at orange estadium

thanks to Galax Malaysia...still drooling till now drool.gif drool.gif

user posted image
Minecrafter
post Jul 11 2015, 10:25 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(mdzaboy @ Jul 11 2015, 10:23 PM)
just won a lucky draw 980TI HOF today at orange estadium

thanks to Galax Malaysia...still drooling till now drool.gif  drool.gif

user posted image
*
shocking.gif Congrats!! rclxms.gif thumbup.gif
SUSHuman10
post Jul 11 2015, 10:37 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(marfccy @ Jul 11 2015, 06:18 PM)
yeah, its one of those odd releases that doesnt seem to have anywhere to fit in

its suffering from the "<Insert nex gen> is near, wait it out longer" syndrome doh.gif

on the bright side of going consumer than Extreme series, you save alot from not getting LGA2011 board

Skylake at least had the decency to support DDR3L instead of pure DDR4
*
But hey, the cheapest LGA2011-v3 chip is a hexacore, while the 1.4k i7 is still a quadcore.

Mobo wise, LGA2011-v3 is expected to last at least another year, while H97/Z97 will pretty much be EOL by this year end.

If you are rich enough to invest on high end i7 (consumer) and Z97, the Haswell-E combination is just marginally pricier while significantly more "future proof" as compared to the consumer counterpart.
marfccy
post Jul 12 2015, 01:00 AM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Human10 @ Jul 11 2015, 10:37 PM)
But hey, cheapest LGA2011V3 is a Hexacores, while the 1.4k i7 is still a Quadcores.

Mobo wise, LGA2012V3 is expect to last at least another year, while H97/Z97 pretty much EOL by this year end.

If you are rich enough to invest on high end i7 (consumer) and Z97, the Haswell-E combination is just marginally pricier while significantly more "future proof" as compared to the consumer counterpart.
*
in a sense, yeah you do last longer

considering the rate Intel is "improving" its performance every generation
TSskylinelover
post Jul 12 2015, 09:31 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Haha. Happy mah haswell e since chinese new year. I wont be changing anymore till intel come out 5ghz stock and o/c 6ghz stable on stock cooler. laugh.gif icon_idea.gif
moron88
post Jul 12 2015, 10:13 AM

Getting Started
**
Junior Member
150 posts

Joined: Jun 2011


Trigger pulled rclxms.gif rclxms.gif rclxms.gif

user posted image

notice after installing the driver and etc etc. my 1080p monitor suddenly added another resolution 2715x1527. will my monitor even display it? lol im using LG 23EA63V
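(Side note: that extra 2715x1527 entry is NVIDIA DSR (Dynamic Super Resolution) at the 2.00x factor: the game renders at that resolution and the driver downscales it to your panel's native 1080p, so the monitor itself never has to display it. A quick check of the math; the factor scales total pixel count, so each axis scales by the square root of the factor.)

# How DSR derives its virtual resolutions from a 1920x1080 panel.
import math

def dsr_resolution(width, height, factor):
    scale = math.sqrt(factor)  # factor applies to pixel count, sqrt(factor) per axis
    return round(width * scale), round(height * scale)

for factor in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):  # the DSR factor options
    print(factor, dsr_resolution(1920, 1080, factor))
# 2.00x -> (2715, 1527), the resolution that showed up above; labels like 1.78x
# are rounded, so that one really corresponds to 2560x1440.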
JohnLai
post Jul 12 2015, 10:42 AM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(mdzaboy @ Jul 11 2015, 10:23 PM)
just won a lucky draw 980TI HOF today at orange estadium

thanks to Galax Malaysia...still drooling till now drool.gif  drool.gif

user posted image
*
So....how about donating your old gpu to me for free....... brows.gif


Anyway, the galaxy cards seem good. Last time when I bought galaxy card in year 2008, its fan was very loud.
This particular 980ti is semipassive or fan running all the time?
cstkl1
post Jul 12 2015, 10:58 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

user posted image

Got the asus sli bridge for second rig. Upgrading its ssd while at it. Cheap man. Massive drop in prices from corsair
Decided main rig will get evga pro v2 bridge. Currently using evga pro v1 on it.
meons
post Jul 12 2015, 11:34 AM

noobies
*******
Senior Member
3,489 posts

Joined: Jul 2008
From: BUTTERWORTH pulau pinang



NVIDIA AIBs Silently Cut Prices on High-End Maxwell Cards


http://wccftech.com/nvidia-aibs-silently-c...29-gtx-980-479/




rclxms.gif rclxms.gif
marfccy
post Jul 12 2015, 02:46 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(meons @ Jul 12 2015, 11:34 AM)
NVIDIA AIBs Silently Cut Prices on High-End Maxwell Cards
http://wccftech.com/nvidia-aibs-silently-c...29-gtx-980-479/
rclxms.gif  rclxms.gif
*
sadly these wont reflect much in Msia sweat.gif

only in Msia, prices go up, never down
Fizzy-Fiz
post Jul 12 2015, 02:59 PM

On my way
****
Senior Member
696 posts

Joined: Dec 2009
QUOTE(cstkl1 @ Jul 12 2015, 10:58 AM)
user posted image

Got the asus sli bridge for second rig. Upgrading its ssd while at it. Cheap man. Massive drop in prices from corsair
Decided main rig will get evga pro v2 bridge. Currently using evga pro v1 on it.
*
its even cheaper in garage, whats happening?

SSJBen
post Jul 12 2015, 03:42 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(goldfries @ Jul 11 2015, 09:23 PM)
this is correct.

the performance is as it is, when people buy the card, it's bought for the performance.

Having 3.5GB doesn't make it lose performance, just some people aren't happy to feel "cheated".

Technically Nvidia isn't wrong because it is 3.5GB + 0.5GB so total is still 4GB.
Why not? Practically every game will run above 60 fps on GTX 980 Ti even with settings set to the max.

If the person uses 1080p all the way then it'll last for ages.


Once gone to 4K, slap on another GTX 980 Ti. That'll be sweet.
*
I suppose so if someone wants to remain at 1080p/60 for a very, very long time then the 980Ti is the way to go. But the way I see it, if one is going to get a 980Ti then a 1440p/144 + Gsync monitor combo is the way to go. Games are just so blissful with this setup.
goldfries
post Jul 12 2015, 10:52 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




Can't disagree with that. GTX 980 Ti handles 1440p 144 G-sync comfortably.
moron88
post Jul 13 2015, 12:15 AM

Getting Started
**
Junior Member
150 posts

Joined: Jun 2011


QUOTE(goldfries @ Jul 12 2015, 10:52 PM)
Can't disagree with that. GTX 980 Ti handles 1440p 144 G-sync comfortably.
*
glad to hear that. biggrin.gif now for a new monitor... any recommendation for 1440p 144 g sync monitor ? cheapest to most exp. TN or IPS
cstkl1
post Jul 13 2015, 01:34 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

Some Zotac 980ti Amp Extreme Results

4770k@4.5ghz
Maximus VI Hero
Corsair Vengeance Pro 2400 CL11 2x8gb
Zotac 980ti Amp Extreme Single/Sli - 1468(+64)/4100(+500) @1.156/1.175v ( Stock Voltage/Stock Fan)
Nvidia 353.38 Driver. 3dmark Validation link below screenshots.
Windows 8.1



There was no throttling so didnt bother with pl and fan adjustment.

Will post overclocked and overvolted results later. Max voltage is 1.243v

This post has been edited by cstkl1: Jul 13 2015, 01:37 AM
shikimori
post Jul 13 2015, 05:23 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(moron88 @ Jul 13 2015, 12:15 AM)
glad to hear that. biggrin.gif now for a new monitor... any recommendation for 1440p 144 g sync monitor ? cheapest to most exp.  TN or IPS
*
How about acer xb270hu rm 2599 quite cheap compared to importing it

Ips , gsync ,144hz

This post has been edited by shikimori: Jul 13 2015, 05:55 AM
Moogle Stiltzkin
post Jul 13 2015, 10:48 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE
But I'm going to be frank with everyone here, the 3.5GB vram thing? It's not an issue. Honestly, it's been blown out of propotion. Stop looking at just numbers and charts, they cannot tell the whole story.


well thats why i read the reviews and they point out that games like mordor when ultra max settings will indeed use up lots of vram as high as near 6gb.

yes, most games aren't as aggressive as that, but it's there, most noticeably at higher resolutions or in newer games like that on ultra settings.

sure if people don't mind lowering setting to not ultra then shouldn't be an issue. but i don't fall under the category of users that like playing with anything other than ultra doh.gif

but ignoring the vram capacity altogether, when it clearly affects some titles, is clearly the wrong way to go about it :/ as some of us would then rather get a card with 6GB vram or more.


so using an example try convincing people get a 4gb vram fiji over say a 980ti with 6gbvram especially at the same price point. thats a tough sell rolleyes.gif

not only do most titles perform slightly better fps for 980ti, but it also has more vram. and according to reviewers the capacity does matter. not me who said this, but professional reviewers go look :}


QUOTE
Concluding

Our product reviews in the past few months and its conclusion are not any different opposed to everything that has happened in the past few days, the product still performans similar to what we have shown you as hey .. it is in fact the same product. The clusterfuck that Nvidia dropped here is simple, they have not informed the media or their customers about the memory partitioning and the challenges they face. Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB mostly are poor console ports or situations where you game in Ultra HD or DSR Ultra HD rendering. In that situation I cannot guarantee that your overall experience will be trouble free, however we have a hard time detecting and replicating the stuttering issues some people have mentioned.


The Bottom line

Utilizing graphics memory after 3.5 GB can result into performance issues as the card needs to manage some really weird stuff in memory, it's nearly load-balancing. But fact remains it seems to be handling that well, it’s hard to detect and replicate oddities. If you unequivocally refuse to accept the situation at hand, you really should return your card and pick a Radeon R9 290X or GeForce GTX 980. However, if you decide to upgrade to a GTX 980, you will be spending more money and thus rewarding Nvidia for it. Until further notice our recommendation on the GeForce GTX 970 stands as it was, for the money it is an excellent performer. But it should have been called a 3.5 GB card with a 512MB L3 GDDR5 cache buffer.



sources
http://www.guru3d.com/news-story/middle-ea...tress-test.html

http://wccftech.com/shadow-of-mordor-ultra...xture-6gb-vram/

http://www.overclock3d.net/articles/gpu_di...s_6gb_of_vram/1


PS: that said, i seriously doubt blizzard games will ever be pushing the gpu much rolleyes.gif not holding my breath for legacy of the void. i suspect it's just gonna be as gpu intensive as the old sc2. it's only games like mordor, crysis that they push the limits.


This post has been edited by Moogle Stiltzkin: Jul 13 2015, 11:26 AM
goldfries
post Jul 13 2015, 11:07 AM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




3.5 GB vs 4GB - very little difference.
SSJBen
post Jul 13 2015, 03:06 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 13 2015, 10:48 AM)
well thats why i read the reviews and they point out that games like mordor when ultra max settings will indeed use up lots of vram as high as near 6gb.

yes most games aren't as aggressive like that, but it's there, and most noticely for higher resolutions, or newer games like that on ultra settings.

sure if people don't mind lowering setting to not ultra then shouldn't be an issue. but i don't fall under the category of users that like playing with anything other than ultra doh.gif

but ignoring the vram capacity altogether that clearly affects some titles as non issue is clearly wrong way to go about it :/ as some of us would then rather get a card with 6gbvram or more.
so using an example try convincing people get a 4gb vram fiji over say a 980ti with 6gbvram especially at the same price point. thats a tough sell rolleyes.gif

not only do most titles perform slightly better fps for 980ti, but it also has more vram. and according to reviewers the capacity does matter. not me who said this, but professional reviewers go look :}
sources
http://www.guru3d.com/news-story/middle-ea...tress-test.html

http://wccftech.com/shadow-of-mordor-ultra...xture-6gb-vram/

http://www.overclock3d.net/articles/gpu_di...s_6gb_of_vram/1
PS: that said, i seriously doubt blizzard games will ever be pushing the gpu much  rolleyes.gif  not holding my breath for legacy of the void. i suspect it's just gonna be as gpu intensive as the old sc2. it's only games like mordor, crysis that they push the limits.
*
EXCEPT, there are cases where High and Ultra are damn near zero in image quality loss.

SoM comparison Ultra vs High textures:




I'm also the type of gamer who wants to max everything out, but you know... sometimes, you just need to be sensible. Why select a setting for near-zero increase in quality when performance takes a significant hit? Where's the sense in that? laugh.gif

Understand that the 970 was NEVER advertised as a card for 4k resolutions. It was always a 1080p/60fps card first and foremost, and it will remain that way until it gets phased out.

This post has been edited by SSJBen: Jul 13 2015, 03:07 PM
SUScrash123
post Jul 13 2015, 03:52 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
official benchmark from skylake cpu make me happy..there is no significant improvement in performance unless u guys wanna use igpu..
so no need to upgrade for new mobo/ddr4/cpu rclxms.gif maybe wait for "tick" to fully utilized 14nm cpu

user posted image

This post has been edited by crash123: Jul 13 2015, 03:53 PM
SSJBen
post Jul 13 2015, 03:58 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


And... the fact that it's going to cost $400 but still using the same glue+TIM method... lol, yeah way to go Intel.

Anyways that's offtopic.
reconvision
post Jul 13 2015, 09:14 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
anyone bought a 980ti with a custom cooler and able to redeem the Arkham Knight code?
After I key in all the info, it says my card is not eligible for it. Btw it's an asus strix 980ti.
cstkl1
post Jul 13 2015, 09:50 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(reconvision @ Jul 13 2015, 09:14 PM)
anyone bought 980ti custom cooler and able to redeem the arkham knight code?
After i key in all the info, it proceed with my card is not eligible for it. btw is asus strix 980ti.
*
Its from steam btw. Bro use one np. Second one just keeping it atm.
yaphong
post Jul 13 2015, 11:58 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


Do you think is it reasonable to sell my ASUS 980 Strix (used) at RM2k?
reconvision
post Jul 14 2015, 12:08 AM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(cstkl1 @ Jul 13 2015, 09:50 PM)
Its from steam btw. Bro use one np. Second one just keeping it atm.
*
Bro, wat do you mean by np?
cstkl1
post Jul 14 2015, 05:26 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(reconvision @ Jul 14 2015, 12:08 AM)
Bro, wat do you mean by np?
*
No problem as it works. No issue.
Moogle Stiltzkin
post Jul 14 2015, 05:40 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
oo.,...ooo

http://www.tweaktown.com/news/46420/amd-pr...idia/index.html


amd's gonna squeeze nvidia's supply yawn.gif hope the prices don't rise....
SUSTheHitman47
post Jul 14 2015, 05:48 AM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(Moogle Stiltzkin @ Jul 14 2015, 05:40 AM)
oo.,...ooo

http://www.tweaktown.com/news/46420/amd-pr...idia/index.html
amd's gonna squeeze nvidia's supply yawn.gif hope the prices don't rise....
*
"priority access", looks like nvidia getting a taste of your own medicine. tongue.gif
Moogle Stiltzkin
post Jul 14 2015, 09:21 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(TheHitman47 @ Jul 14 2015, 05:48 AM)
"priority access", looks like nvidia getting a taste of your own medicine.  tongue.gif
*
well that's what happened when they placed the wrong bet on hmc :/

we thought the fiji low stock was bad. now there are rumors that hbm2 will also be low on supply, and amd is going to get priority. thats gonna suck for nvidia doh.gif

i'm wondering though, should i get a gp100 or a gp104 ? not really sure what the difference is other than the former will come first. will it follow the same strategy where the higher end card comes out first ? cause i'd rather get that card if thats the case doh.gif

Is my Intel 3770 ivy bridge going to be enough :/ ? don't really feel like upgrading that until cannonlake. Also wondering how amd zen will turn out, whether it will rock the boat in the cpu market kek.
reconvision
post Jul 14 2015, 10:05 AM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(cstkl1 @ Jul 14 2015, 05:26 AM)
No problem as it works. No issue.
*
i tried using one and it still gives me the error lol. i will just try asking asus malaysia for help. thanks btw.
cstkl1
post Jul 14 2015, 10:15 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Moogle Stiltzkin @ Jul 14 2015, 05:40 AM)
oo.,...ooo

http://www.tweaktown.com/news/46420/amd-pr...idia/index.html
amd's gonna squeeze nvidia's supply yawn.gif hope the prices don't rise....
*
Squeeze how?? Amd with credit terms n nvidia with cash terms??
It will just backfire on amd. On sales/orders alone nvidia will exceed amd.

Minecrafter
post Jul 14 2015, 11:39 AM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(Moogle Stiltzkin @ Jul 14 2015, 09:21 AM)
well thats what happened when they placed the wrong bet on hmc :/

we thought those fiji low stock was bad. now with rumors that hbm2 will also be low on supply, and amd going to get priority. thats gonna suck for nvidia doh.gif

i'm wondering though should i get a gp100 or a gp104 ? not really sure whats the difference other than the former will come first. will it follow the same strategy that the higher end card comes out first ? cause i rather get that card if thats the case doh.gif

Is my Intel 3770 ivy bridge going to be enough :/ ? don't really feel like upgrading that until cannonlake. Also wondering how the amd zen will turn out, whether it will rock the boat in the cpu market kek.
*
i7 3770 is still going strong after what? 3 years? CPU upgrades don't look that big atm.. should be enough.

SUSTheHitman47
post Jul 14 2015, 12:12 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(Moogle Stiltzkin @ Jul 14 2015, 09:21 AM)
well thats what happened when they placed the wrong bet on hmc :/

we thought those fiji low stock was bad. now with rumors that hbm2 will also be low on supply, and amd going to get priority. thats gonna suck for nvidia doh.gif

i'm wondering though should i get a gp100 or a gp104 ? not really sure whats the difference other than the former will come first. will it follow the same strategy that the higher end card comes out first ? cause i rather get that card if thats the case doh.gif

Is my Intel 3770 ivy bridge going to be enough :/ ? don't really feel like upgrading that until cannonlake. Also wondering how the amd zen will turn out, whether it will rock the boat in the cpu market kek.
*
cpus don't age as quickly as gpus.
i have confidence it can last more than 2 generations.. at least
TSskylinelover
post Jul 14 2015, 12:35 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 14 2015, 09:21 AM)
Is my Intel 3770 ivy bridge going to be enough :/ ? don't really feel like upgrading that until cannonlake. Also wondering how the amd zen will turn out, whether it will rock the boat in the cpu market kek.
*
QUOTE(Minecrafter @ Jul 14 2015, 11:39 AM)
i7 3770 is still going strong after what?3 years?CPU upgrades doesn't look that big atm..should be enough.
*
Haha this can tahan till next fifa world cup laugh.gif rclxms.gif
TSskylinelover
post Jul 14 2015, 12:37 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(yaphong @ Jul 13 2015, 11:58 PM)
Do you think is it reasonable to sell my ASUS 980 Strix (used) at RM2k?
*
1.8k maybe people interested icon_idea.gif icon_idea.gif
JohnLai
post Jul 14 2015, 12:54 PM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(cstkl1 @ Jul 14 2015, 10:15 AM)
Squeeze how?? Amd with credit terms n nvidia with cash terms??
Its just will backfire on amd. Just on sales/order nvidia will exceed amd.
*
Indeed, by the time AMD's exclusivity with Hynix is over, Nvidia will be the one enjoying the cost savings due to more mature HBM2 yields.
yaphong
post Jul 14 2015, 02:12 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(skylinelover @ Jul 14 2015, 12:37 PM)
1.8k maybe people interested icon_idea.gif icon_idea.gif
*
Hahaha even at 2k I would need to fork out an additional RM1.4k to get the ASUS 980 Ti Strix (comparing apples to apples) for a 20% performance increase... doesn't seem worth it... hmm.gif

This post has been edited by yaphong: Jul 14 2015, 02:12 PM
llk
post Jul 14 2015, 03:25 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(yaphong @ Jul 14 2015, 02:12 PM)
Hahaha even at 2k I would need to fork additional RM1.4k to get back ASUS 980 TI Strix (comparing apple to apple) for 20% performance increase... doesn't seem to worth it...  hmm.gif
*
Someone is selling the Strix 980 at an extremely low price, brand new somemore sweat.gif

https://forum.lowyat.net/topic/3648605


yaphong
post Jul 14 2015, 03:40 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(llk @ Jul 14 2015, 03:25 PM)
Someone is selling extremely low price for the Strix 980, brand new somemore  sweat.gif

https://forum.lowyat.net/topic/3648605
*
Haha too bad then. I see quite a few only trying to sell at 2.1k but no takers... I guess I am gonna stick with my 980 then.
eugene88
post Jul 15 2015, 08:14 AM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


Is it worth it to upgrade to a 970 from a R9 280X for less heat and more performance?
Cause my tiny room can feel the heat already when I started playing games on my PC

moron88
post Jul 15 2015, 10:51 AM

Getting Started
**
Junior Member
150 posts

Joined: Jun 2011


QUOTE(eugene88 @ Jul 15 2015, 08:14 AM)
Is it worth it to upgrade to a 970 from a R9 280X for less heat and more performance?
Cause my tiny room can feel the heat already when I started playing games on my PC
*
Isnt that more like a downgrade instead of an upgrade? or Did u type wrongly. What is ur PSU, AMD cards are like some hungry power beast.
Minecrafter
post Jul 15 2015, 10:59 AM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(eugene88 @ Jul 15 2015, 08:14 AM)
Is it worth it to upgrade to a 970 from a R9 280X for less heat and more performance?
Cause my tiny room can feel the heat already when I started playing games on my PC
*
If you feel like you need more horsepower,why not?If it's enough,no need to change.

QUOTE(moron88 @ Jul 15 2015, 10:51 AM)
Isnt that more like a downgrade instead of an upgrade? or Did u type wrongly. What is ur PSU, AMD cards are like some hungry power beast.
*
From 280X to GTX970. tongue.gif
eugene88
post Jul 15 2015, 11:04 AM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(moron88 @ Jul 15 2015, 10:51 AM)
Isnt that more like a downgrade instead of an upgrade? or Did u type wrongly. What is ur PSU, AMD cards are like some hungry power beast.
*
Downgrade?
850W Bronze
cstkl1
post Jul 15 2015, 11:20 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(eugene88 @ Jul 15 2015, 08:14 AM)
Is it worth it to upgrade to a 970 from a R9 280X for less heat and more performance?
Cause my tiny room can feel the heat already when I started playing games on my PC
*
QUOTE(eugene88 @ Jul 15 2015, 11:04 AM)
Downgrade?
850W Bronze
*
Go for it.
970 is a good card.

moron88
post Jul 15 2015, 11:22 AM

Getting Started
**
Junior Member
150 posts

Joined: Jun 2011


OH shit see wrongly. @@ my bad... hahaha abit sleepy because everywhere started holiday mood already. nothing to do in office.
TSskylinelover
post Jul 15 2015, 12:38 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(moron88 @ Jul 15 2015, 11:22 AM)
OH shit see wrongly. @@ my bad... hahaha abit sleepy because everywhere started holiday mood already. nothing to do in office.
*
Haha plus raya blues also dang
SSJBen
post Jul 15 2015, 05:45 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Looks like MSI 980Ti Gaming is in the local market now. RM3339 RSP. lol.

On the other hand, the EVGA 980Ti Classy was sold out within 2 hours on Newegg. doh.gif
Didn't even have a chance to add into cart.
TSskylinelover
post Jul 15 2015, 08:38 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
haha told ya EVGA is the best ever laugh.gif rclxms.gif
Moogle Stiltzkin
post Jul 15 2015, 09:12 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 15 2015, 05:45 PM)
Looks like MSI 980Ti Gaming is in the local market now. RM3339 RSP. lol.

On the other hand, the EVGA 980Ti Classy was sold out within 2 hours on Newegg. doh.gif
Didn't even have a chance to add into cart.
*
does that come with water cooling block ? hmm.gif

nm found it

QUOTE
The EVGA GeForce GTX 980 Ti Hydro Copper is designed for watercooling enthusiasts. The Hydro Copper waterblock is a full-cover block that spans the entire length of the graphics card, complete with a swappable inlet/outlet for compatibility with custom watercooling solutions.

It is also factory overclocked, with a 1140MHz base speed and 1228MHz boost speed.

http://www.evga.com/Products/Product.aspx?pn=06G-P4-4999-KR

799.99 USD = 3,043.20 MYR


even with gst it's only RM3,225.79, so why is the msi more expensive hmm.gif shipping costs?
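(For the arithmetic: RM3,043.20 x 1.06 ≈ RM3,225.79, assuming the 6% GST rate.)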

This post has been edited by Moogle Stiltzkin: Jul 15 2015, 09:35 PM
llk
post Jul 15 2015, 10:10 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(Moogle Stiltzkin @ Jul 15 2015, 09:12 PM)
does that come with water cooling block ?  hmm.gif

nm found it
http://www.evga.com/Products/Product.aspx?pn=06G-P4-4999-KR

799.99 USD = 3,043.20 MYR
even with gst is only 3225.792, so why is the msi more expensive  hmm.gif  shipping costs?
*
Basically the EVGA Hydro is a reference card with an EK waterblock, whereas the MSI Gaming is a fully custom non-reference card
SSJBen
post Jul 15 2015, 10:30 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 15 2015, 09:12 PM)
does that come with water cooling block ?  hmm.gif

nm found it
http://www.evga.com/Products/Product.aspx?pn=06G-P4-4999-KR

799.99 USD = 3,043.20 MYR
even with gst is only 3225.792, so why is the msi more expensive  hmm.gif  shipping costs?
*
Retailers don't need to earn meh? tongue.gif

Moogle Stiltzkin
post Jul 15 2015, 10:37 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 15 2015, 10:30 PM)
Retailers don't need to earn meh? tongue.gif
*
no cause i want free stuff *snicker snicker rolleyes.gif
user posted image



QUOTE(llk @ Jul 15 2015, 10:10 PM)
Basically EVGA Hydro is a reference card with EK waterblock whereby MSI Gaming is fully custom non reference card
ah i see. ty for the clarification notworthy.gif

This post has been edited by Moogle Stiltzkin: Jul 15 2015, 10:39 PM
Loki[D.d.G]
post Jul 15 2015, 11:51 PM

Quis custodiet ipsos custodes
*******
Senior Member
3,648 posts

Joined: Sep 2009
From: Twixt nether and ether
QUOTE(SSJBen @ Jul 15 2015, 05:45 PM)
On the other hand, the EVGA 980Ti Classy was sold out within 2 hours on Newegg. doh.gif
Didn't even have a chance to add into cart.
*
I remember getting my GTX 970 on Newegg 24 hours after it was released... Got lucky when some bloke cancelled his order and I managed to snag the card in his place biggrin.gif

This post has been edited by Loki[D.d.G]: Jul 15 2015, 11:52 PM
eugene88
post Jul 16 2015, 01:50 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(llk @ Jul 14 2015, 03:25 PM)
Someone is selling extremely low price for the Strix 980, brand new somemore  sweat.gif

https://forum.lowyat.net/topic/3648605
*
Since I'm getting a 970 around RM1500-RM1700
Should I get this one instead? laugh.gif
sheva78
post Jul 16 2015, 01:52 PM

New Member
*
Junior Member
15 posts

Joined: Feb 2008
From: Butterworth, Penang


Guys, just bought a Noops branded displayport cable to use with my GTX 960 but all i'm getting is a screen with "no input signal" written on it. Am i doing something wrong or did i just buy a bad cable? Can you guys recommend me another brand for the displayport cable, one that doesn't cost too much, preferably less than rm150. Currently using HDMI input. My monitor is an AOC I2369VM if it helps.
llk
post Jul 16 2015, 02:00 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(eugene88 @ Jul 16 2015, 01:50 PM)
Since I'm getting a 970 around RM1500-RM1700
Should I get this one instead?  laugh.gif
*
Brand new 970 should be around RM1500, yesterday saw someone selling his ASUS 970 strix at RM1000 (not sure still available or not). If u prefer brand new then better get that gtx980 strix at RM1.7K (pls check whether it is local warranty)
eugene88
post Jul 16 2015, 02:12 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(llk @ Jul 16 2015, 02:00 PM)
Brand new 970 should be around RM1500, yesterday saw someone selling his ASUS 970 strix at RM1000 (not sure still available or not). If u prefer brand new then better get that gtx980 strix at RM1.7K (pls check whether it is local warranty)
*
The 970 Strix is gone already
The Gigabyte 970 G1 which I liked is around RM16xx
Isn't it fishy that the price is so low for a brand new 980? sweat.gif
llk
post Jul 16 2015, 02:16 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(eugene88 @ Jul 16 2015, 02:12 PM)
The 970 Strix no more already
The Gigabyte 970 G1 which I liked around RM16xx
Not fishy considering the price is so low for a brand new 980?  sweat.gif
*
That is why you need to check carefully; if possible, COD is better if you want to buy.
eugene88
post Jul 16 2015, 02:22 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(llk @ Jul 16 2015, 02:16 PM)
That is why you need to check carefully, possible cod is better if you want to buy.
*
He said local warranty and need to register at Asus warranty
Can cod
Can anything go wrong?
llk
post Jul 16 2015, 02:36 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(eugene88 @ Jul 16 2015, 02:22 PM)
He said local warranty and need to register at Asus warranty
Can cod
Can anything go wrong?
*
If got local warranty then should be ok, but must get receipt, i heard Asus Malaysia won't accept walk-in customers for RMA cases (to be confirmed).
eugene88
post Jul 16 2015, 03:02 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(llk @ Jul 16 2015, 02:36 PM)
If got local warranty then should be ok, but must get receipt, i heard Asus Malaysia won't accept walk in customer for RMA case (to be confirm).
*
He said no receipt but will create own invoice hmm.gif
llk
post Jul 16 2015, 03:04 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(eugene88 @ Jul 16 2015, 03:02 PM)
He said no receipt but will create own invoice  hmm.gif
*
He will create it for you? If it's a brand new sealed item you shouldn't worry too much
eugene88
post Jul 16 2015, 03:15 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(llk @ Jul 16 2015, 03:04 PM)
He will create for you? If brand new sealed item shouldn't be worried too much
*
Yes, he claimed that he will create one for me as he is not an authorized dealer
He will also provide 1 week of 1-to-1 exchange
llk
post Jul 16 2015, 03:17 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(eugene88 @ Jul 16 2015, 03:15 PM)
Yes he claimed that he will create for me as he is not an authorized dealer
He will also provide 1 week to 1 to 1 exchange
*
Then up to you to decide, the price is quite attractive to be honest
eugene88
post Jul 16 2015, 03:21 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(llk @ Jul 16 2015, 03:17 PM)
Then up to you to decide, the price is quite attractive to be honest
*
Yeah really attractive, fingers crossed hopefully nothing will go wrong
llk
post Jul 16 2015, 03:23 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(eugene88 @ Jul 16 2015, 03:21 PM)
Yeah really attractive, fingers crossed hopefully nothing will go wrong
*
rclxms.gif rclxms.gif rclxms.gif
heerosakura
post Jul 16 2015, 03:44 PM

Getting Started
**
Junior Member
175 posts

Joined: Aug 2012
thinking, will the 980 reference be priced higher than non-reference? tongue.gif
SUSHuman10
post Jul 16 2015, 04:35 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(sheva78 @ Jul 16 2015, 01:52 PM)
Guys, just bought a Noops branded displayport cable to use with my GTX 960 but all i'm getting is a screen with no input signal written on it. Am i doing something wrong or did i just bought a bad cable. Can you guys recommend me another brand for the displayport cable. One that doesn't cost to much preferably less than rm150. Currently using HDMI input. My monitor is AOC I2369VM if it helps.
*
You sure the ports on both GPU and monitor are functional?

You changed to the correct input on monitor?

This post has been edited by Human10: Jul 16 2015, 04:40 PM
sheva78
post Jul 16 2015, 05:17 PM

New Member
*
Junior Member
15 posts

Joined: Feb 2008
From: Butterworth, Penang


QUOTE(Human10 @ Jul 16 2015, 04:35 PM)
You sure the ports on both GPU and monitor are functional?

You changed to the correct input on monitor?
*
Yup, even tried it on another graphic card. Still won't work.
SUSHuman10
post Jul 16 2015, 05:34 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(sheva78 @ Jul 16 2015, 05:17 PM)
Yup, even tried it on another graphic card. Still won't work.
*
Try return it and change another cable then see how.

SSJBen
post Jul 16 2015, 05:35 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Ready?
Kingpin 980Ti.


» Click to show Spoiler - click again to hide... «


brows.gif
SUSHuman10
post Jul 16 2015, 05:36 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(SSJBen @ Jul 16 2015, 05:35 PM)
Ready?
Kingpin 980Ti.
» Click to show Spoiler - click again to hide... «


brows.gif
*
Dat 8+8+6pins connectors shocking.gif

The stock cooler looks a little underwhelming if the card really drew over 400W of power... sweat.gif

This post has been edited by Human10: Jul 16 2015, 05:39 PM
TSskylinelover
post Jul 16 2015, 05:44 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Ho yeah triple kill mang drool.gif rclxm9.gif
SSJBen
post Jul 16 2015, 06:05 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Human10 @ Jul 16 2015, 05:36 PM)
Dat 8+8+6pins connectors shocking.gif

The stock cooler looks a little underwhelming if the card really drew over 400W of power... sweat.gif
*
Honestly, it's made to run on water. It just needs those full copper fins to justify the price-tag.
reconvision
post Jul 16 2015, 09:55 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
Anyone here using a PSU that is considered quiet during heavy gaming? my 980ti is causing my seasonic m12ii to sound so loud it's getting very unpleasant.
llk
post Jul 16 2015, 09:58 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(reconvision @ Jul 16 2015, 09:55 PM)
Anyone here using PSU that is consider quiet during heavy gaming? my 980ti is causing my seasonic m12ii to sound so loud Is getting very unpleasant.
*
I'm using Corsair Ax1200i don't have any issue
reconvision
post Jul 16 2015, 10:20 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(llk @ Jul 16 2015, 09:58 PM)
I'm using Corsair Ax1200i don't have any issue
*
quiet even on heavy load? how long have u been using it? Coz i read some comments on amazon that the AX860i died after a few months of use, usually less than a year.
llk
post Jul 16 2015, 10:27 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(reconvision @ Jul 16 2015, 10:20 PM)
quiet even on heavy load? how long u have been using it? Coz i read some comment on amazon tat the Ax 860i dead after few months of used Usually less than a year.
*
Previously I was using the AX860i for about 2 years and sold it 2 months ago, then bought this AX1200i; till now neither of them has given me any problem. hardware is hard to predict, as long as it is covered by warranty then should be fine.
reconvision
post Jul 16 2015, 11:03 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(llk @ Jul 16 2015, 10:27 PM)
Previously I was using AX860i for about 2years and sold it out 2mths ago, bought this AX1200i till now non of them give me any problem, hardware is hard to predict, as long as it covered by warranty then should be fine.
*
Thx for the info. Im looking at superflower leadex gold and ax series.
llk
post Jul 16 2015, 11:06 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(reconvision @ Jul 16 2015, 11:03 PM)
Thx for the info. Im looking at superflower leadex gold and ax series.
*
Seasonic X and P series also very good, you may check it out as well
TSskylinelover
post Jul 17 2015, 05:55 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(reconvision @ Jul 16 2015, 09:55 PM)
Anyone here using PSU that is consider quiet during heavy gaming? my 980ti is causing my seasonic m12ii to sound so loud Is getting very unpleasant.
*
Haha same here that i need 2 wake up 2 off the pc halfway through sleep laugh.gif doh.gif

Normally left pc on 2 check late night football scores with extra downloads
Minecrafter
post Jul 17 2015, 06:22 AM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(SSJBen @ Jul 16 2015, 05:35 PM)
Ready?
Kingpin 980Ti.
» Click to show Spoiler - click again to hide... «


brows.gif
*
drool.gif drool.gif drool.gif
Loki[D.d.G]
post Jul 17 2015, 08:28 AM

Quis custodiet ipsos custodes
*******
Senior Member
3,648 posts

Joined: Sep 2009
From: Twixt nether and ether
QUOTE(reconvision @ Jul 16 2015, 09:55 PM)
Anyone here using PSU that is consider quiet during heavy gaming? my 980ti is causing my seasonic m12ii to sound so loud Is getting very unpleasant.
*
I'm currently using the EVGA SuperNOVA 750 G2 and I can't hear it over the sound of my Noctua fans smile.gif
reconvision
post Jul 17 2015, 08:43 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(skylinelover @ Jul 17 2015, 05:55 AM)
Haha same here that i need 2 wake up 2 off the pc halfway through sleep laugh.gif doh.gif

Normally left pc on 2 check late night football scores with extra downloads
*
Haha ya really have to switch it off. I'd rather sacrifice my downloads than my sleep lol.
reconvision
post Jul 17 2015, 08:46 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(LokiD.d.G @ Jul 17 2015, 08:28 AM)
I'm currently using the EVGA SuperNOVA 750 G2 and I can't hear it over the sound of my Noctua fans  smile.gif
*
Where did you get the EVGA PSU?? The supernova is actually on my list and i heard it's a rebrand of the superflower leadex platinum, which is great. If i can get it locally i'm sure im getting the evga supernova.
Loki[D.d.G]
post Jul 17 2015, 11:18 PM

Quis custodiet ipsos custodes
*******
Senior Member
3,648 posts

Joined: Sep 2009
From: Twixt nether and ether
QUOTE(reconvision @ Jul 17 2015, 08:46 PM)
Where you get Evga PSU?? The supernova is actually in my list and i heard is the rebrand of superflower leadex platinum  which is great. If i can get it in local i'm sure im getting the evga supernova.
*
Off Amazon. It's the desktop I use back in the US. Didn't realize it wasn't available in the Malaysian market.

And yes, it's a top quality product which is backed by EVGA's excellent warranty service for ten years
Demonic Wrath
post Jul 19 2015, 08:32 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

QUOTE(reconvision @ Jul 16 2015, 09:55 PM)
Anyone here using PSU that is consider quiet during heavy gaming? my 980ti is causing my seasonic m12ii to sound so loud Is getting very unpleasant.
*
Yes.. the m12ii will be loud when under load. I used it (750w) for my GTX970 before; even a 350w load was considered loud. Finally changed to an Enermax Revolution87 750w, now it's ok.
SUScrash123
post Jul 20 2015, 10:53 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(reconvision @ Jul 16 2015, 09:55 PM)
Anyone here using PSU that is consider quiet during heavy gaming? my 980ti is causing my seasonic m12ii to sound so loud Is getting very unpleasant.
*
Try the CM V850.. i bought it for around 500 last year and it's really worth it.. really quiet.. fully modular
terradrive
post Jul 20 2015, 11:06 AM

RRAAAWWRRRRR
******
Senior Member
1,943 posts

Joined: Apr 2005


QUOTE(reconvision @ Jul 16 2015, 09:55 PM)
Anyone here using PSU that is consider quiet during heavy gaming? my 980ti is causing my seasonic m12ii to sound so loud Is getting very unpleasant.
*
I'm using a Seasonic X-850; even when i oc'ed my r9 290 and it sucked a lot of power, I couldn't hear the fans on my seasonic thumbup.gif

Just use the highest rated PSUs in terms of quality, such as the Seasonic X series, Corsair AX series, Cooler Master V series etc.
Minecrafter
post Jul 20 2015, 03:07 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


IdealTech just updated their price list,and..
user posted image

As expected,on the high side. tongue.gif
SSJBen
post Jul 20 2015, 03:11 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Still cheaper than Asus Strix and MSI Gaming lol...
TSskylinelover
post Jul 20 2015, 05:08 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Minecrafter @ Jul 20 2015, 03:07 PM)
IdealTech just updated their price list,and..
user posted image

As expected,on the high side. tongue.gif
*
I can get cheaper with pika123 rclxms.gif
terradrive
post Jul 20 2015, 08:53 PM

RRAAAWWRRRRR
******
Senior Member
1,943 posts

Joined: Apr 2005


QUOTE(SSJBen @ Jul 16 2015, 05:35 PM)
Ready?
Kingpin 980Ti.
» Click to show Spoiler - click again to hide... «


brows.gif
*
Is copper really good?
Copper is good for heat conduction between metals, but it is worse than aluminium for transferring heat via convection from the metal to the air.
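(Putting rough numbers on that: conduction through the metal scales with thermal conductivity, roughly Q = k·A·ΔT/L, and copper's k (~400 W/m·K) is well above aluminium's (~235 W/m·K); the fin-to-air step is convection, roughly Q = h·A·ΔT, which depends mainly on the heat-transfer coefficient h, airflow and fin surface area rather than on which metal the fin is made of. Ballpark figures only.)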
marfccy
post Jul 21 2015, 02:22 AM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Minecrafter @ Jul 20 2015, 03:07 PM)
IdealTech just updated their price list,and..
user posted image

As expected,on the high side. tongue.gif
*
on the bright side, you get easier warranty dealing if shit happens
cstkl1
post Jul 21 2015, 06:02 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

980ti kingpin
The prices are based on asic.

Asic 80+ usd 1049.
SSJBen
post Jul 21 2015, 06:24 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(cstkl1 @ Jul 21 2015, 06:02 PM)
980ti kingpin
The prices are based on asic.

Asic 80+ usd 1049.
*
And I bet a lot of people are going to start falling for this whole ASIC quality market again... lol.

Then again, siliconlottery.com has already proven there is a market for it.
cstkl1
post Jul 21 2015, 09:56 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(SSJBen @ Jul 21 2015, 06:24 PM)
And I bet a lot of people are going to start falling for this whole ASIC quality market again... lol.

Then again, siliconlottery.com has already proven there is a market for it.
*
Dont always assume what u learn from one generation of cards holds true.

Already told u, Tx n Ti are all about the asic from day one.
Even kingpin stated it. I stated it from day one.
At 1.24v under 65c, typical boost clocks by asic quality:
60% - 1405
65% - 1456
70% - 1506/7
75% - 1558
80% - 1608
Benching u can push 50-75mhz more

Second is the pll between the vram memory controller and the gpu.
Third is power delivery n the ability of the gpu to scale with voltage, subject to pll voltage as well.
Kingpin cards are supposed to fix the second n third.

Anyone who clocks their cpu... the second part is common knowledge.

Switching frequency is also important between both vram n gpu. It keeps scaling in check.

I nvr dove into previous gen asic but i am gonna assume some dont understand how important 2&3 are. Only on cards like hof/matrix/kingpin/lightning do u have some control over 2&3.

Reference cards are luck of the draw on all three. But generally if u have a high asic card with a high vram clock.. ure good.
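As a rough illustration of the asic-to-clock scaling in the numbers above (a sketch only; the linear interpolation between the posted points is an assumption for illustration, not something stated in the post):

CODE
# Rough estimate of expected boost clock (MHz) at ~1.24V under 65C,
# interpolated from the ASIC% -> clock points listed above.
# The linear fit itself is an assumption for illustration only.
ASIC_CLOCK_POINTS = {60: 1405, 65: 1456, 70: 1507, 75: 1558, 80: 1608}

def estimated_boost_clock(asic_percent):
    """Linearly interpolate/extrapolate expected boost clock from ASIC quality."""
    points = sorted(ASIC_CLOCK_POINTS.items())
    (lo_asic, lo_clk), (hi_asic, hi_clk) = points[0], points[-1]
    slope = (hi_clk - lo_clk) / (hi_asic - lo_asic)  # ~10 MHz per ASIC point
    return lo_clk + slope * (asic_percent - lo_asic)

for asic in (62, 68, 72, 78):
    print("ASIC %d%% -> ~%d MHz" % (asic, round(estimated_boost_clock(asic))))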



reconvision
post Jul 21 2015, 10:17 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
QUOTE(LokiD.d.G @ Jul 17 2015, 11:18 PM)
Off Amazon. It's the desktop I use back in the US. Didn't realize it wasn't available in the Malaysian market.

And yes, it's a top quality product which is backed by EVGA's excellent warranty service for ten years
*
Thanks for the info.
reconvision
post Jul 21 2015, 10:18 PM

New Member
*
Junior Member
40 posts

Joined: Dec 2013
Thanks to everyone who replied to my question, appreciated. BTW, I'm overclocking my gpu now; can I increase only the core clock and leave the memory clock untouched?
DoX
post Jul 22 2015, 01:40 AM

Getting Started
**
Junior Member
127 posts

Joined: Mar 2014


wrong post.

This post has been edited by DoX: Jul 22 2015, 02:11 AM
pspslim007
post Jul 22 2015, 06:59 AM

Enthusiast
*****
Senior Member
701 posts

Joined: Apr 2010
From: Malaysia


hi guys, anyone tried to overclock the 780ti? what's the optimal clock? mind sharing, thanks.
Minecrafter
post Jul 22 2015, 03:36 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(pspslim007 @ Jul 22 2015, 06:59 AM)
hi guys, anyone tried to clock 780ti ? whats the optimazation clock? mind sharing thanks.
*
You'll need to figure it out yourself. tongue.gif Different cards have different headroom to overclock. Like my R7 265, i can increase my core and memory clocks more compared to some reviewers.


SSJBen
post Jul 22 2015, 03:44 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(reconvision @ Jul 21 2015, 10:18 PM)
Thanks everyone who reply my question. appreciated. BTW, I'm overclocking my gpu now can I only increase the core clock and left the memory clock untouch?
*
You can OC both. But do core clock first.

For mem clock, Hynix and Samsung ICs should do 7500mhz pretty easily. Most Samsung chips can reach 8000 and beyond. Don't think I've seen any of those crappy Elpida on 980Ti/Titan X yet (correct me if I'm wrong).
genjo
post Jul 24 2015, 09:53 AM

⭐⭐⭐⭐⭐⭐
******
Senior Member
1,431 posts

Joined: Mar 2009

After some research i have decided to try Nvidia this time and i got myself a Palit GTX980 Jetstream.

Able to max out almost all of my games. Just GTAV and Witcher 3 can't be maxed out if i want to achieve 60fps
JohnLai
post Jul 24 2015, 10:42 AM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(genjo @ Jul 24 2015, 09:53 AM)
After some researches i have decided to try nvdia this time and i got myself Palit GTX980 Jetstream.

Able to max out almost all of my games. Just GTAV and Witcher 3 can't max out if want to achieve 60fps
*
You play the games at 1080p?

Witcher 3 = you can adjust the hairworks settings with the latest 1.07 patch.

Hairworks = on for all
Hairworks AA = reduce to 4X or 2X (or even no AA?)
Hairworks preset = Low
(Rest of settings = all ultra except population set to low; population low means a max of 75 npc characters in town at a time)

With my GTX 970, I noticed that when I turned the hairworks preset from low to high, GPU usage increased from 73% to 99% (Hairworks AA at 4x).
I overclocked my GTX970 to a base clock of 1321MHz (boost up to 1500+MHz) and a VRAM effective clock of 8000MHz, which is technically equivalent to a stock GTX 980.
Also, witcher 3 cutscenes are rendered at 30fps by default (developer decision): go to My Documents --> Witcher 3 --> open "user" with notepad --> add MovieUbersampling=false under the [Visuals] section and change MovieFramerate=60. Ubersampling kills the framerate hard during cutscenes.


As for GTA V = antialiasing set to TXAA 4X (MSAA 4X, turn on TXAA), Extended Distance Scaling set to 70%, Grass quality set to high (very high and ultra tank the fps too much), Post FX set to High (can't see any difference between high and ultra), Reflection Quality set to high (ultra merely does extra calculation which doesn't contribute much visually)
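For anyone who'd rather script that cutscene tweak than edit the file by hand, here's a minimal sketch; the folder name and the key values already present in your user.settings are assumptions that may differ per install, so back the file up first:

CODE
# Minimal sketch of the Witcher 3 cutscene tweak described above.
# Assumes the default user.settings location under My Documents; the exact
# folder name may differ per install. Back the file up before running.
from pathlib import Path

settings = Path.home() / "Documents" / "The Witcher 3" / "user.settings"
text = settings.read_text(encoding="utf-8")

if "MovieUbersampling=" in text:
    text = text.replace("MovieUbersampling=true", "MovieUbersampling=false")
else:
    # Add the flag right under the [Visuals] section header.
    text = text.replace("[Visuals]", "[Visuals]\nMovieUbersampling=false", 1)

# Raise the cutscene framerate cap from the 30fps default (no-op if the
# key is absent or already changed).
text = text.replace("MovieFramerate=30", "MovieFramerate=60")

settings.write_text(text, encoding="utf-8")
print("Patched", settings)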
TSskylinelover
post Jul 24 2015, 11:36 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Haha no news on pascal? Suddenly feel 2 quiet here unlike last month. laugh.gif doh.gif
shikimori
post Jul 24 2015, 09:42 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


went full retard and bought the Asus MG279Q, couldn't wait for the XB270HU cry.gif (long waiting period, up to 1 month). got Assassin's Creed Unity and DLC as a free game bundle together with it

user posted image the box is kinda retarded as well for missing the Q, but its the same model

dunno if it's worth it or not in terms of price. But dayummmm, the moment i played BF4 at 144hz it felt like playing call of duty, will I miss G-Sync or Freesync?

I dont know, but IPS + 144hz is a whole new experience for me at least.
yaphong
post Jul 24 2015, 09:56 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(skylinelover @ Jul 24 2015, 11:36 AM)
Haha no news on pascal? Suddenly feel 2 quiet here unloke last month. laugh.gif doh.gif
*
I think last month was hot due to "Which GTX 980 Ti to buy"...
yaphong
post Jul 24 2015, 10:04 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(shikimori @ Jul 24 2015, 09:42 PM)
went full retard and bought Asus MG279Q cannot wait for XB270hu  cry.gif long waiting period (up to 1 month) got Ass Creed Unity and DLC as free game bundle together
the box kinda retarded as well for missing the Q but its the same model

dunno worth it or not in terms of price . But dayummmm the moment playing BF4 at 144hz feels like playing call of duty , will I miss G-Sync or Freesync ?

I dont know but IPS + 144hz is a whole new experience for me at least .
*
How much is this?
shikimori
post Jul 24 2015, 10:11 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(yaphong @ Jul 24 2015, 10:04 PM)
How much is this?
*
2.5k tongue.gif
marfccy
post Jul 24 2015, 10:16 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(shikimori @ Jul 24 2015, 10:11 PM)
2.5k tongue.gif
*
but you dont have an AMD card doh.gif

how are you gonna test Freesync le
shikimori
post Jul 24 2015, 10:20 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(marfccy @ Jul 24 2015, 10:16 PM)
but you dont have an AMD card doh.gif

how are you gonna test Freesync le
*
was looking to purchase a fury x but the shopkeeper said someone already grabbed it cb.... so many rich pipul


marfccy
post Jul 24 2015, 10:23 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(shikimori @ Jul 24 2015, 10:20 PM)
was looking to purchase fury x but the shop keeper said someone goreng it already cb.... so many rich pipul
*
no worries, a regular Fury would also do laugh.gif

Fury X is out of stock worldwide mah
SSJBen
post Jul 24 2015, 10:27 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Asus needs to hurry the-eff up with the PG279Q already. From announcement at CES, to no news, to suddenly "watch out for news in Fall 2015" to now no news again. doh.gif

It better be good when it comes out... in fact I don't know what's actually keeping them. The PG279Q is basically the same model as the MG279Q, just with Gsync.
Jeez everything is there already.
llk
post Jul 24 2015, 11:05 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(shikimori @ Jul 24 2015, 10:20 PM)
was looking to purchase fury x but the shop keeper said someone goreng it already cb.... so many rich pipul
*
C-Zone have the Powercolor Fury X @RM2649 but limited stock though
shikimori
post Jul 24 2015, 11:26 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(llk @ Jul 24 2015, 11:05 PM)
C-Zone have the Powercolor Fury X @RM2649 but limited stock though
*
quite a nice price sad.gif too bad im in penang .

Which one is better, freesync or g-sync? From what I heard freesync suffers from ghosting and limitations on refresh rate, for example this mg279q is capped at 35Hz-90Hz

Gsync is capable of rates that range from 30Hz to 144Hz cry.gif


Taken from Wccftech

QUOTE
FreeSync Pros :
– Easier to integrate into a wider range of monitors due to lack of any additional hardware.
– Significantly less expensive than G-Sync, no licensing fees.
– Enables all the usual monitor features and display outputs.
– Gives users the option of V-Sync on or Off.
FreeSync Cons :
– Currently limited to six graphics cards and six APUs.
FreeSync Compromise (not a pro or a con) :
– Reverts back to the monitor’s maximum fixed refresh rate when the framerate dips below the minimum threshold.

G-Sync Pros :
– Compatible with a wider range of graphics cards.
G-Sync Cons :
– Requires dedicated hardware in the monitor and demands licensing fees.
– Limits monitor features, sound and display output options to DisplayPort.
– Measurably more expensive than FreeSync.
– Currently doesn’t give users the option to disable V-Sync above the maximum refresh rate of the monitor.
G-Sync Compromise (not a pro or a con) :
– Frame duplication extends G-Sync’s functionality below the minimum threshold but causes flickering.



Read more: http://wccftech.com/amd-freesync-nvidia-gs.../#ixzz3gp5KJUp9
clawhammer
post Jul 24 2015, 11:43 PM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(shikimori @ Jul 24 2015, 09:42 PM)
dunno worth it or not in terms of price . But dayummmm the moment playing BF4 at 144hz feels like playing call of duty , will I miss G-Sync or Freesync ?

I dont know but IPS + 144hz is a whole new experience for me at least .
*
If your graphics card can do high FPS without dipping to 40-50FPS and you have a 144Hz monitor, forget about GSYNC or FreeSync, I don't think you will miss much smile.gif
goldfries
post Jul 24 2015, 11:51 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




I have never seen the need for those stuff.

So damn hard just to get the framerate to tear to begin with.
marfccy
post Jul 25 2015, 12:17 AM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(goldfries @ Jul 24 2015, 11:51 PM)
I have never seen the need for those stuff.

So damn hard just to get the framerate to tear to begin with.
*
its one of those subjective topics i guess

only once have i experienced tearing on my monitor, and thats cause i was playing Dead Space 3 at 100FPS

other games wise, no tearing at all, even with vsync off

odd hmm.gif

This post has been edited by marfccy: Jul 25 2015, 12:18 AM
shikimori
post Jul 25 2015, 12:17 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(clawhammer @ Jul 24 2015, 11:43 PM)
If your graphics card can do high FPS without dipping to 40-50FPS and have a 144Hz, forget about GSYNC or Free Sync, I don't think you will miss much smile.gif
*
biggrin.gif thanks man that's what I want to hear . Kinda take a gamble not to wait for XB270hu or the GSYNC version of this MG279Q
Dunno whats the fuss about this sync thing

QUOTE(goldfries @ Jul 24 2015, 11:51 PM)
I have never seen the need for those stuff.

So damn hard just to get the framerate to tear to begin with.
*
sad.gif that means no review from you about this stuff? I thought this was supposed to be the next best thing for monitors and graphics cards sad.gif
SUSAxeFire
post Jul 25 2015, 12:21 AM

Casual
***
Junior Member
368 posts

Joined: Oct 2012
From: Penang
As long as 60 60 60 is all I care
goldfries
post Jul 25 2015, 12:21 AM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




I tested a FreeSync monitor, and I couldn't produce screen tearing. tongue.gif
clawhammer
post Jul 25 2015, 12:40 AM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(shikimori @ Jul 25 2015, 12:17 AM)
Dunno whats the fuss about this sync thing
Well, GSYNC works in the scenarios it's intended for. For example if you're on 4K and FPS starts jumping all over in AAA game titles like Witcher 3 or Dragon Age Inquisition, GSYNC helps. Of course, a 2560 x 1440 resolution can easily be driven by high end graphics cards these days.
SSJBen
post Jul 25 2015, 01:01 AM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Err... I think there's a little bit of confusion here.

Gsync (or even Freesync) isn't just for preventing tearing. It's also to normalize input lag and reduce frame hitching.
Note I say "normalize", not reduce.
clawhammer
post Jul 25 2015, 01:06 AM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(SSJBen @ Jul 25 2015, 01:01 AM)
Err... I think there's a little bit of confusion here.

Gsync (or even Freesync) isn't just for preventing tearing. It's also to normalize input lag and reduce frame hitching.
Note I say "normalize", not reduce.
*
You're right but the input lag part isn't a big deal if you have a 144Hz screen and able to drive a decent FPS out of it. I have the 4K Acer and ROG Swift in my room now and the 144Hz is certainly a whole new experience altogether biggrin.gif

SSJBen
post Jul 25 2015, 01:14 AM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(clawhammer @ Jul 25 2015, 01:06 AM)
You're right but the input lag part isn't a big deal if you have a 144Hz screen and able to drive a decent FPS out of it. I have the 4K Acer and ROG Swift in my room now and the 144Hz is certainly a whole new experience altogether biggrin.gif
*
I concur. I'm on a 120hz screen, so I know how much better it is than a 60hz experience.

I just hate it when frametimes jump up and down, then the input lag follows. It's annoying.
clawhammer
post Jul 25 2015, 01:27 AM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(SSJBen @ Jul 25 2015, 01:14 AM)
I concur. I'm on a 120hz screen, so I know how much better it is than a 60hz experience.

I just hate it when frametimes jump up and down, then the input lag follows. It's annoying.
*
A 980 Ti would help to fix that FPS jump problem biggrin.gif If needed, 2 pieces would be just nice. I get solid 144Hz on my SLI on stock clock/boost.
TSskylinelover
post Jul 25 2015, 06:33 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(AxeFire @ Jul 25 2015, 12:21 AM)
As long as 60 60 60 is all I care
*
Haha same here.

I dont buy the whatever-sync crap.

Now i'm already itching for 4k.

Just hoping the pascal flagship can do 60fps in 4k on a single card.

laugh.gif rclxms.gif
SSJBen
post Jul 25 2015, 02:18 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(skylinelover @ Jul 25 2015, 06:33 AM)
Haha same here.

I dont buy the whatever sync crap.

Now i already itching 4k.

Just hoping the pascal flagship can 60fps in 4k in single card.

laugh.gif rclxms.gif
*
Once again, it's not just about the "sync crap" lol. 60hz is so last gen once you go to 120/144.

And no, Gsync and Freesync are not "crap" either. If you've never tried it, you'll never know what you've been missing. It's obviously still overpriced, which is one of the main reasons people are calling them crap. For what it does, I mean really no offense, but you have no idea.

This post has been edited by SSJBen: Jul 25 2015, 02:21 PM
nettelim
post Jul 25 2015, 05:50 PM

New Member
*
Junior Member
6 posts

Joined: May 2006
planning to get gtx970 sli to play some AAA games. (unity, witcher3, black ops3)

but this card has 3.5gb and is less future proof compared to amd cards. (getting a higher resolution monitor and hoping to be able to max out AA).
should I wait for nvidia's next generation card that replaces the gtx970?


clawhammer
post Jul 25 2015, 06:29 PM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(nettelim @ Jul 25 2015, 05:50 PM)
planning to get gtx970 sli play some AAA games. (unity, witcher3, black ops3)

but this card has 3.5gb and less future proof compare to amd cards. (getting higher resolution monitor and hope able to max out AA).
should I wait for nvidia next generation card that replace gtx970?
*
3.5GB would work fine for 1440p and it also depends on the game. BF4 uses lots of VRAM while Witcher 3 uses a lot less and produces great graphics. It's all about developers optimizing and doing a great job on their code.
Instead of 970 SLI, a single 980 Ti would do the job and it's always better to have a single card setup than SLI. You only go SLI if;

1. A single card solution is unable to give you enough juice
2. Cost is a concern; 2 x 970 SLI is cheaper than a 980 Ti
3. You just want the looks of SLI

SLI might (though not always) introduce various issues like scaling, stuttering, etc. Unless you plan to buy now and stick with your 970 for 2-3 years, there's no point worrying about "future proof" because eventually you end up upgrading and future cards will come with more VRAM. In technology, the wait never ends, because if you keep on waiting, you will never buy anything smile.gif The only reason why the 970 becomes cheap is because there's something new, and the cycle continues.
SUSHuman10
post Jul 25 2015, 06:40 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(nettelim @ Jul 25 2015, 05:50 PM)
planning to get gtx970 sli play some AAA games. (unity, witcher3, black ops3)

but this card has 3.5gb and less future proof compare to amd cards. (getting higher resolution monitor and hope able to max out AA).
should I wait for nvidia next generation card that replace gtx970?
*
If you don't already have a 970 or the mobo doesn't have good SLI support, I will always suggest a single 980Ti.

At best, 970 SLI scales to just slightly better than a 980Ti, while having 2.5GB less effective VRAM.

Price wise, they are pretty close together too.
cstkl1
post Jul 25 2015, 08:01 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(skylinelover @ Jul 25 2015, 06:33 AM)
Haha same here.

I dont buy the whatever sync crap.

Now i already itching 4k.

Just hoping the pascal flagship can 60fps in 4k in single card.

laugh.gif rclxms.gif
*
Gsync .. Vrr is not crap dude. U got to try it to understand it

Doubt the pascal flagship can do that. Generally ure gonna need 2x the performance of a Tx, since each gen is about a 40-50% gain.

So we are two gens away. Volta will be it.
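(Rough arithmetic on that: two generations of ~40-50% gains compound to roughly 1.45 x 1.45 ≈ 2.1x, i.e. about double a Titan X.)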

Skylake's interesting part is nvme. Only reason to upgrade.
Ddr4 hasnt matured to 8gb density sticks at 4800 c18-c21. No news on dp 1.3.
usb c is currently in its infancy.

Z170 is bound to have more dmi issues with the pch. Z87/z97 already has tons of issues.

nettelim
post Jul 25 2015, 08:21 PM

New Member
*
Junior Member
6 posts

Joined: May 2006
» Click to show Spoiler - click again to hide... «



» Click to show Spoiler - click again to hide... «


2x 970 is the best lineup at the current time.
Not considering a 980/980Ti because it is not worth it price wise. RM3.2k~3.7k for one card is too much.
btw im getting a USED card, it is hard to find a 980ti on the used market.

So it is not good timing to get 2k/4k now?
one 970 and stick with 1080p, and wait for better tech to support it?


SSJBen
post Jul 25 2015, 08:33 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(nettelim @ Jul 25 2015, 08:21 PM)
» Click to show Spoiler - click again to hide... «

» Click to show Spoiler - click again to hide... «


2x 970 is best lane up in current time.
Not considering 980/980Ti because is it not price wise. RM3.2k~3.7k for one card is too much.
btw im getting USED card, it is hard to find 980ti in used market.

So it is not a good timing to get 2k/4k now?
one 970 and stick with 1080, wait for better tech to support?
*
In Msia, no it's not a good time. What you are paying now is severely overpriced for non-legit reasons.
2x970 is literally the same price as an aftermarket 980Ti in the states; over in Msia..? Lol, 2x aftermarket 970s barely covers a single reference 980Ti.

Live in the US? It's a heck of a time to jump onto 980Ti + 1440p + Gsync setup.
clawhammer
post Jul 25 2015, 09:56 PM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(nettelim @ Jul 25 2015, 08:21 PM)
2x 970 is best lane up in current time.
Not considering 980/980Ti because is it not price wise. RM3.2k~3.7k for one card is too much.
btw im getting USED card, it is hard to find 980ti in used market.

So it is not a good timing to get 2k/4k now?
one 970 and stick with 1080, wait for better tech to support?
*
Sure, it's your choice and I'm just sharing my experiences biggrin.gif if cost is a concern then buying 2 units of lower models would be more suitable. But if possible, always go with a single card solution.

If you want a better gaming experience then go 1440p at least. 1080p is really a thing of the past. Today's cards can easily support 1440p setups.
moron88
post Jul 25 2015, 10:27 PM

Getting Started
**
Junior Member
150 posts

Joined: Jun 2011


only the premium ones are rm 3.2k to 3.7k, i got my zotac 980ti amp for just rm 2.8k from lowyat.

going to get a 1440p monitor next year. Gsync or not, still thinking about it. all about the price. T.T
eugene88
post Jul 25 2015, 10:28 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


Jumped ship from AMD to Nvidia, now a lot of games are crashing at launch
Used DDU to remove the AMD and Nvidia drivers but it doesn't fix the problem

Any suggestions?


SUSHuman10
post Jul 25 2015, 10:44 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(eugene88 @ Jul 25 2015, 10:28 PM)
Jump ship from AMD to Nvidia, now a lot of games are crashing at launch
Used DDU to remove AMD and Nvidia drivers but doesn't fix the problem

Any suggestions?
*
Games? Drivers version?

Honestly, Nvidia does release buggy drivers from time to time; installing back an older driver sometimes gives better stability and performance.
eugene88
post Jul 25 2015, 10:52 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(Human10 @ Jul 25 2015, 10:44 PM)
Games? Drivers version?

Honestly, Nvidia do release buggy drivers from time to time, install back older driver sometimes more stable and better performance.
*
353.30
Sleeping Dogs, Assassin's Creed Unity, The Witcher 3, GTA V managed to run, but whenever an event is triggered it will crash the game

SUSHuman10
post Jul 25 2015, 10:54 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(eugene88 @ Jul 25 2015, 10:52 PM)
353.30
Sleeping Dogs, Assassin's Creed Unity, The Witcher 3, GTA V managed to run but whenever there's an event triggered it will crash the game
*
Card?

Cause 353.30 is pretty stable for me on GTA V.
eugene88
post Jul 25 2015, 11:00 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(Human10 @ Jul 25 2015, 10:54 PM)
Card?

Cause 353.30 is pretty stable for me on GTA V.
*
Asus 980 Strix
SUSHuman10
post Jul 25 2015, 11:05 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(eugene88 @ Jul 25 2015, 11:00 PM)
Asus 980 Strix
*
Hmm, seriously no idea, the driver should be fine and the card has been around for a long time, haven't heard of such a problem for this particular card.

Worst case, try a clean copy of windows + drivers.

This post has been edited by Human10: Jul 25 2015, 11:06 PM
eugene88
post Jul 25 2015, 11:08 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(Human10 @ Jul 25 2015, 11:05 PM)
Hmm, seriously no idea, driver should be fine and the card had being around for such long time, hadn't heard such problem for the particular card.

The worst is try on a clean copy of windows + drivers.
*
I'm gonna try the 353.49 driver
Kucci
post Jul 25 2015, 11:10 PM

Getting Started
**
Junior Member
221 posts

Joined: Jan 2010
QUOTE(moron88 @ Jul 25 2015, 10:27 PM)
only premium ones are rm 3.2k to 3.7k, i got my zotac 980ti amp for just rm 2.8k from lowyat.

going to get a 1440p monitor next year. Gsync or not,, still thinking about it. all about the price. T.T
*
since already pay 3k for a high-end card, should get premium for better cooling and oc
SUSHuman10
post Jul 25 2015, 11:10 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(eugene88 @ Jul 25 2015, 11:08 PM)
I'm gonna try the 353.49 driver
*
Do try a clean install.
TSskylinelover
post Jul 25 2015, 11:20 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Driver sweeper works all the time laugh.gif rclxms.gif
eugene88
post Jul 25 2015, 11:45 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(Human10 @ Jul 25 2015, 11:10 PM)
Do try a clean install.
*
Tried, not working

QUOTE(skylinelover @ Jul 25 2015, 11:20 PM)
Driver sweeper works all the time laugh.gif rclxms.gif
*
Driver Sweeper has been replaced with Display Driver Uninstaller, according to Guru3D tongue.gif
SUScrash123
post Jul 26 2015, 03:18 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
I tried to reapply the thermal compound since i hear many companies dont apply thermal paste properly. Opened my 980ti since there is no sticker on the screws.

Voila. Zotac spread the thermal paste like shit. Used CM Extreme Fusion X1, and the result is so good. Before this, playing gta 5 the temp was around 75-77°C, now it's around 66-70. Hell, even my idle temp went from 40 to 37, and the time to reach max temp also increased. Totally worth it. rclxms.gif

This post has been edited by crash123: Jul 26 2015, 03:23 AM


Attached thumbnail(s)
Attached Image
clawhammer
post Jul 26 2015, 03:56 AM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(crash123 @ Jul 26 2015, 03:18 AM)
I try to reaply thermal compound since i hear many company dont put thermal paste properly.


Yes, it's normal and most stock compounds are horrible smile.gif
Moogle Stiltzkin
post Jul 26 2015, 01:48 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE
If these alleged rumors hold true, Pascal is looking like it's going to be a massive leap in performance compared to Maxwell. If the full fat GP100 chip is going to have 17 billion transistors then that is over double the amount compared to GM200 and over 3 times the amount compared to GM204. Also 32 GB VRAM!? I can't wait to see what kind of setup you're going to need to even come close using up that much frame buffer. Also Pascal will be utilizing TSMC's all new 16nm FinFet + technology allowing for 65% higher speed, 2 times the density and 70% less power than that of 28HPM which is Maxwells production process. So expect to see even lower power usage, ultra low temperatures and cards finally breaking the 2GHZ barrier on air. Personally I was on the fence with upgrading to two 980 Tis from my 980s but I think Im just going to sit and wait a bit longer!
What do you guys think?


http://www.fudzilla.com/news/graphics/3830...ion-transistors

http://wccftech.com/nvidia-pascal-gpu-17-b...rrives-in-2016/



QUOTE
With Pascal GPU, NVIDIA will return to the HPC market with new Tesla products. Maxwell, although great in all regards was deprived of necessary FP64 hardware and focused only on FP32 performance. This meant that the chip was going to stay away from HPC markets while NVIDIA offered their year old Kepler based cards as the only Tesla based options. Pascal will not only improve FP64 performance but also feature mixed precision that allows NVIDIA cards to compute at 16-bit at double the accuracy of FP32. This means that the cards will enable three tiers of compute at FP16, FP32 and FP64. NVIDIA’s far future Volta GPU will further leverage the compute architecture as it is already planned to be part of the SUMMIT and Sierra super computers that feature over 150 PetaFlops of compute performance and launch in 2017 which indicates the launch of Volta just a year after Pascal for the HPC market.

http://wccftech.com/nvidia-volta-gpus-ibm-...supercomputers/



$_$; pascal is the one to wait for.

though i'm wondering what exactly the launch date for volta is. last time they said a year after pascal. then it was bumped to 2. now they're saying 2017 again... lel....



so basically... fiji mostly played catch up. it didn't really outright defeat the 980ti, but they did bring in hbm (to cope with the performance curve, a necessary step at some point), and that was about it. also they were capped at 4gb vram due to hbm1 technology constraints .... less than the 6gb vram on a 980ti. but even worse, they matched the price/performance exactly to the 980ti, which i felt was a poor decision, cause it made it more likely people would just buy a 980ti.

if the rumors are true, pascal is not only adding hbm, they're also going to add a huge jump in performance. and if that wasn't enough, we thought it would be 8gb vram, but now it could possibly be even up to 16-32gb ......

even i think 16gb+ seems a bit unnecessary at this point. i'm just fine with 8gb for future proofing. or do 4k gaming ultra textures really need that much ? it would be interesting to see when those reviews come out smile.gif


now question is price ... hmm.gif



QUOTE(goldfries @ Jul 25 2015, 12:21 AM)
I tested a FreeSync monitor, and I couldn't produce screen tearing. tongue.gif
*
did you test for any issues when the fps dips below the minimum vrr range ? how does that affect gaming, and how noticeable was it hmm.gif i'm interested to know.


This post has been edited by Moogle Stiltzkin: Jul 26 2015, 01:56 PM
Minecrafter
post Jul 26 2015, 02:35 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(Moogle Stiltzkin @ Jul 26 2015, 01:48 PM)
http://www.fudzilla.com/news/graphics/3830...ion-transistors

http://wccftech.com/nvidia-pascal-gpu-17-b...rrives-in-2016/
http://wccftech.com/nvidia-volta-gpus-ibm-...supercomputers/
$_$; pascal is the one to wait for.

though i'm wondering what exactly is the launch date for volta. last time they said a year after pascal. then it was bumped to 2. now their saying 2017 again... lel....
so basically... the fiji mostly played catch up. didn't really outright defeat the 980ti. but they did bring in hbm (to cope with the curve of performance, this was a necessary step at some point). but that was about it. also they capped at 4gb vram due to hbm1 technology constraints .... less than the 6gb vram on a 980ti. but even worse was the price/performance they matched exactly to the 980ti which i felt was a poor decision choice, cause it was more likely people would buy a 980ti because of that.

if the rumors are true, pascal is not only adding hbm, their also going to add a huge jump in performance. and if that wasn't enough, we thought it would be 8gb vram, but now it could possible be even up to 16-32gb ......

even i think 16+ seems a bit unnecessary at this point. i'm just fine with 8gb for future proofing.  or do 4k gaming ultra textures really need that much ? it would be interesting to see when those reviews come out  smile.gif
now question is price ...  hmm.gif
did you test regarding any issues when the fps dips below the minimum vrr ? how does that affect gaming, and how noticeable was it  hmm.gif  i'm interested to know.
*
Hori crap. shocking.gif I'm sure skylinelover will like this. tongue.gif biggrin.gif

Well, the card might run out of processing power even before reaching its max vRAM capacity, but since it will "add a huge jump in performance", you'll never know. hmm.gif

This post has been edited by Minecrafter: Jul 26 2015, 02:37 PM
Moogle Stiltzkin
post Jul 26 2015, 02:58 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Minecrafter @ Jul 26 2015, 02:35 PM)
Hori crap. shocking.gif I'm sure skylinelover will like this. tongue.gif  biggrin.gif

Well,the card might run out of processing power even before reaching its max vRAM capacity,but since "add a huge jump in performance",you'll never know. hmm.gif
*
i'm very skeptical it will be 32gb.... maybe in a titan x tier product perhaps. if i had to guess it would be between 8-16gb, and even 16gb is crazy. 8gb is pretty sweet doh.gif

double transistor count (compared to a titan x and a fiji).... drool.gif

hopefully the prices aren't above the opening price of a 980ti :/ can only hope.
SSJBen
post Jul 26 2015, 03:17 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


32GB is for their Quadro cards.

8GB is pretty much a given for the next x70/x80 cards, whatever Nvidia is going to call them (1080 would be hilarious).
Question now is whether Hynix has enough stock left for Nvidia or not, since AMD will get the major bulk of HBM2 in Q1 2016.

Obviously things can change very quickly, it's business after all.
Moogle Stiltzkin
post Jul 26 2015, 03:20 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 26 2015, 03:17 PM)
32GB is for their Quadro cards.
okay that makes more sense.



QUOTE
8GB is pretty much a given for the next x70/x80 cards, whatever Nvidia is going to call them (1080 would be hilarious).
Question now is, if Hynix has enough stock left for Nvidia or not since AMD will get he major bulk of HBM2 in Q1 2016.
will be interesting to see how things pan out here. worst case scenario, limited stock = higher price = i'm screwed rclxub.gif


QUOTE
Obviously things can change very quickly, it's business after all.
actually there was some chatter on this very subject. like nvidia has to come out with something good rather than rest on its laurels, because amd could come out with something straight after fiji. i think there was mention they were aiming for this, though i'm not fully sure.

QUOTE
Except that HBM2 is a drop-in replacement for HBM1 on AMD's Fury parts, so even before Greenland arrives, AMD could get a Fury-X revision to market with 8, or more gigs of HBM memory. The interposer work is done for AMD and its the bottom chip in the HBM stack that hosts the control logic for the HBM die stacks above, and the interposer memory traces for HBM2 are not increasing. A Fury-XA revision may be available just as the HBM2 memory stacks arrive hot off that final assembly line. That and some Tweaks could put Fury over the Ti's performance metrics, and maybe with process refinements and some more overclocking on any Fury-X revisions. AMD has exclusivity on that lines process, and its HBM, Nvidia has got to take the available to all standards and make a line of its own, outside of AMD's line that has AMD's name on it!
http://www.pcper.com/news/Graphics-Cards/R...B-HBM2#comments



i think the problem with fiji was that hbm by itself wasn't going to be a game changer where it mattered the most, or the benefits wouldn't be obviously visible to the user (e.g. huge leaps in fps gains, for example)

if the pascal rumors are true, then they would have not only introduced hbm, but also added other stuff that would genuinely increase performance enough to have the wow factor. just pray it's true smile.gif


so is the size still on track to be similar to fiji ? here's the pic shown
user posted image


This post has been edited by Moogle Stiltzkin: Jul 26 2015, 03:30 PM
Minecrafter
post Jul 26 2015, 03:27 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


Reset the xxx numbers? biggrin.gif Let's start off with the GTX970 replacement, the GTX007 James Bond edition. brows.gif
billytong
post Jul 26 2015, 04:14 PM

Lord Sauron
*******
Senior Member
4,522 posts

Joined: Jan 2003
From: Mordor, Middle Earth.


QUOTE(eugene88 @ Jul 25 2015, 10:28 PM)
Jumped ship from AMD to Nvidia, and now a lot of games are crashing at launch
Used DDU to remove the AMD and Nvidia drivers but it doesn't fix the problem

Any suggestions?
*

last I heard 347.88 is the last good driver. The recent batch has all gone crazy.

Anyway, I just heard some news about the GTX950 coming out on Aug 17th; the 750/750Ti is getting a price cut to make room for the 950.

U guys have any idea how long our local retailers will take to reflect the new 750/750Ti pricing? hmm.gif might need to get one soon.

marfccy
post Jul 26 2015, 04:22 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Moogle Stiltzkin @ Jul 26 2015, 03:20 PM)
okay that makes more sense.
will be interesting to see how things pan out here. worse case scenario, limited stock = higher price = i'm screwed  rclxub.gif
actually there was some chatter on very subject. like nvidia had to come out with something good rather than rest on it laurels. because amd can come out with something after fiji straight after. i think there was mention they were aiming for this though i'm not fully sure.
http://www.pcper.com/news/Graphics-Cards/R...B-HBM2#comments
i think the problem with fiji was, hbm by itself wasn't going to be a game changer where it mattered the most. or the benefits wouldn't be significantly transparent to the user (e.g. huge leaps in fps gains... for example)

if the pascal rumors are true, then they would have not only intro'd hbm, but also added other stuff that would indeed increase performance significantly that would have the wow factor.  just pray it's true  smile.gif
so the size is it still on track to be similar to the fiji ? here's the pic shown
user posted image
*
it's because, like all new tech, nobody is there to make use of it yet

give it time, and soon devs will start to utilise the high bandwidth capability, making the current cards near obsolete
SheepMekk
post Jul 26 2015, 04:55 PM

Casual
***
Junior Member
314 posts

Joined: Jun 2008
Hey all, any of you guys with a 980Ti have any idea of your fan speeds? Mine don't even turn on at 60°C shocking.gif regulating fan speeds with Afterburner in the meantime.
eugene88
post Jul 26 2015, 05:07 PM

Look at all my stars!!
*******
Senior Member
2,176 posts

Joined: Sep 2010


QUOTE(billytong @ Jul 26 2015, 04:14 PM)
last I heard 347.88 is the last best driver. The recent batch is all went crazy.

Anyway, I just heard some news about GTX950 coming out in Aug 17th, 750/750ti is getting price cut to make room for 950.

U guys have any idea how long would our local retailer to reflect the new 750/750ti pricing?  hmm.gif might need to get one soon.
*
I found out what caused the problem, it's from Duet Display
SUSHuman10
post Jul 26 2015, 06:10 PM

Look at all my stars!!
*******
Senior Member
6,774 posts

Joined: Nov 2010
QUOTE(SheepMekk @ Jul 26 2015, 04:55 PM)
Hey all, any of you guys with 980ti have any idea of your fan speeds? Mine doesn't even turn on at 60C shocking.gif regulating fan speeds with afterburner at the mean time.
*
The idle fanless mode has been around on some GPU models by Asus, MSI, EVGA and maybe more since the release of the 970/980.

Nothing much to worry about; it's just that manufacturers are confident in the passive cooling of the heatsink and tweaked the vBIOS to do so. You can manually flash a custom vbios to set your own fan profile. But again, Afterburner is the safer option since it only deals with software instead of the bios.
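if you just want to see when the fans actually kick in versus temperature, a small monitor script does the job without touching the vbios. a minimal sketch using the nvidia-ml-py bindings (pynvml), assuming they're installed; fan speed reads back as a percentage of max:

CODE
# poll GPU temperature and fan duty once a second (Ctrl+C to stop)
import time
import pynvml  # from the nvidia-ml-py package (assumed installed)

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(gpu)  # commanded duty, percent
        print("temp: %d C   fan: %d%%" % (temp, fan))
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()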
marfccy
post Jul 26 2015, 07:21 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Human10 @ Jul 26 2015, 06:10 PM)
The idle fanless mode had being around with some models of GPU by Asus, MSI, EVGA and may be more since the release of 970/980.

Nothing much to worry about, its just manufacturers are confident with the passive cooling of the heatsink and tweaked the VBios to so. You can manually flash a custom vbios to fix your own fan profile. But again afterburner is the safer mode since they only deal with software instead of bios.
*
i don't think he needs to do so, right?

normally you can just adjust when the fan starts spinning to your liking (lowering the temp threshold)
TSskylinelover
post Jul 26 2015, 07:46 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 26 2015, 03:20 PM)
so the size is it still on track to be similar to the fiji ? here's the pic shown
user posted image
*
that is the end of the days of using long ass big ass card laugh.gif

guess i selling off my huge casing 4 the new drawer casing then doh.gif
vaizard
post Jul 26 2015, 07:57 PM

Casual
***
Junior Member
304 posts

Joined: Sep 2008


hello all.
Attached Image
anyone encountered this problem before? I keep facing this problem when I'm gaming. Really need your help. TQIA

This post has been edited by vaizard: Jul 26 2015, 07:58 PM
Moogle Stiltzkin
post Jul 26 2015, 08:10 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(skylinelover @ Jul 26 2015, 07:46 PM)
that is the end of the days of using long ass big ass card laugh.gif

guess i selling off my huge casing 4 the new drawer casing then doh.gif
*
will definitely be more room for the cabling :}

since i use a fancy water cooling radiator, i have to stick with my full atx aluminium pc casing :}

but it would be interesting to see other people's rigs and how small they can go now to fit the new card smile.gif
Moogle Stiltzkin
post Jul 26 2015, 09:03 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
noticed that pascal has mixed precision fp16/32/64


so i was reading up on what exactly it has to do with gaming :/

QUOTE
You can deduce the difference between double precision floating point (FP64) and single precision floating point (FP32) from the name. FP64 results are significantly more precise than FP32. This added precision in the results is crucial for scientific research, professional applications and servers. And less so in video games. Even though FP64 is used in games in a very limited subset of functions, the bulk of video game and graphics code relies on FP32. As such this added precision in turn requires more capable hardware which would net higher costs by increasing the size of the chip while simultaneously increasing power consumption.


bottomline
QUOTE
So, since the GTX Titan Black has a peak of 5.1 TFLOPS single precision floating point performance, a 3:1 ratio means that double precision compute goes down to 1.7 TFLOPs. And with AMD’s Hawaii XT which has a peak of 5.6 TFLOPs of FP32 compute performance, a 2:1 ratio means that it will go down to a more respectable 2.8 TFLOPs of FP64 compute performance. This advantage in FP64 compute is why AMD succeeded in capturing the top spot in the Green500 list of the world’s most power efficient supercomputers with it’s Hawaii XT powered FirePro S9150 server graphics cards.

The FP32 to FP64 ratio in Nvidia’s GM204 and GM206 Maxwell GPUs, powering the GTX 980, 970 and 960 is 32:1. Which means the GPU will be 32 times slower when dealing with FP64 intensive operations compared to FP32. As we’ve discussed above this is mostly OK for video games but downright unacceptable for professional applications.

If Nvidia’s GM200 does end up with a similarly weak double precision compute capability the card will have very limited uses in the professional market. However in theory the reduction of FP64 hardware resources on the chip should make it more power efficient in games and FP32 compute work. Even though I’m not entirely convinced that it’s a worthwhile trade off. Especially for a card that is poised to go into the next generation Quadro flagship compute cards.


http://wccftech.com/nvidia-gm200-gpu-fp64-performance/
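to put those ratios into actual numbers, here's a minimal sketch in python (the 5.1 and 5.6 TFLOPS figures are from the quote above; the ~5 TFLOPS figure for a GTX 980-class GM204 is my own rough assumption, and the helper name is just for illustration):

CODE
# FP64 throughput from an FP32 peak and the FP32:FP64 ratio quoted above
def fp64_tflops(fp32_tflops, ratio):
    # e.g. a 3:1 ratio means FP64 runs at one third of the FP32 rate
    return fp32_tflops / ratio

print(fp64_tflops(5.1, 3))   # GTX Titan Black: ~1.7 TFLOPS FP64
print(fp64_tflops(5.6, 2))   # Hawaii XT: ~2.8 TFLOPS FP64
print(fp64_tflops(5.0, 32))  # assumed ~5 TFLOPS GM204 at 32:1: ~0.16 TFLOPS FP64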



anyway, pascal will have improved fp64 as well as mixed precision.
QUOTE
With Pascal GPU, NVIDIA will return to the HPC market with new Tesla products. Maxwell, although great in all regards was deprived of necessary FP64 hardware and focused only on FP32 performance. This meant that the chip was going to stay away from HPC markets while NVIDIA offered their year old Kepler based cards as the only Tesla based options. Pascal will not only improve FP64 performance but also feature mixed precision that allows NVIDIA cards to compute at 16-bit at double the accuracy of FP32. This means that the cards will enable three tiers of compute at FP16, FP32 and FP64. NVIDIA’s far future Volta GPU will further leverage the compute architecture as it is already planned to be part of the SUMMIT and Sierra super computers that feature over 150 PetaFlops of compute performance and launch in 2017 which indicates the launch of Volta just a year after Pascal for the HPC market.

http://wccftech.com/nvidia-pascal-gpu-17-b...rrives-in-2016/

This post has been edited by Moogle Stiltzkin: Jul 26 2015, 09:10 PM
SSJBen
post Jul 26 2015, 09:14 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 26 2015, 09:03 PM)
noticed the pascal has mixed precision fp16/32/64
so was reading on what exactly it has to do with gaming :/
bottomline
http://wccftech.com/nvidia-gm200-gpu-fp64-performance/
anyway with pascal will have improved fp64 as well as mixed precision.

http://wccftech.com/nvidia-pascal-gpu-17-b...rrives-in-2016/
*
One thing bro, don't quote wccftech too much. Their articles are all opinions (yet they think it's facts).
Minecrafter
post Jul 26 2015, 10:05 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(skylinelover @ Jul 26 2015, 07:46 PM)
that is the end of the days of using long ass big ass card laugh.gif

guess i selling off my huge casing 4 the new drawer casing then doh.gif
*
TBH,i prefer a long card,like the Sapphire's Tri-X/3 fan Vapor-X/Toxic,MSI's Lightning etc.
SheepMekk
post Jul 26 2015, 11:55 PM

Casual
***
Junior Member
314 posts

Joined: Jun 2008
QUOTE(Human10 @ Jul 26 2015, 06:10 PM)
The idle fanless mode had being around with some models of GPU by Asus, MSI, EVGA and may be more since the release of 970/980.

Nothing much to worry about, its just manufacturers are confident with the passive cooling of the heatsink and tweaked the VBios to so. You can manually flash a custom vbios to fix your own fan profile. But again afterburner is the safer mode since they only deal with software instead of bios.
*
Ahh I see. A little OCD about the idling and running temperatures, so I'll just make them run laugh.gif even tried underclocking/downclocking

Although there are random spikes (to 100%) causing the driver to malfunction for a few seconds, even using version 353.06 which is said to be stable on reddit.
Moogle Stiltzkin
post Jul 27 2015, 06:05 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 26 2015, 09:14 PM)
One thing bro, don't quote wccftech too much. Their articles are all opinions (yet they think it's facts).
*
will the new doom game be out by the time pascal arrives hmm.gif

cause that's the game i want to be eye candy pimping on with pascal drool.gif

also there's star citizen as well.
terradrive
post Jul 27 2015, 09:28 AM

RRAAAWWRRRRR
******
Senior Member
1,943 posts

Joined: Apr 2005


QUOTE(SSJBen @ Jul 26 2015, 09:14 PM)
One thing bro, don't quote wccftech too much. Their articles are all opinions (yet they think it's facts).
*
Yes, there was one where they claimed the Fury Nano was benchmarked, but the article itself shows the numbers were just calculated doh.gif
SSJBen
post Jul 27 2015, 02:33 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 27 2015, 06:05 AM)
will the new doom game be out by the time pascal arrives  hmm.gif

cause thats the game i want to be eye candy pimping on with the pascal  drool.gif

also theres star citizen as well.
*
I believe it would be a late Q2 or mid-Q3 2016 release, just my estimated guess based on Bethesda's fiscal schedule. They have their Q1 covered with Fallout 4 already.

If all goes well and Nvidia stays on track, Pascal should come out by Q3 2016. There are rumors circulating that Nvidia will release big Pascal first, as opposed to what they did with Kepler and Maxwell. Just a rumor though, depends on the market as always.
Moogle Stiltzkin
post Jul 27 2015, 03:05 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 27 2015, 02:33 PM)
I believe it  would be a late Q2 or mid-Q3 2016 release, just my estimated guess following Bethesda's fiscal release. They have their Q1 covered with Fallout 4 already.

If all goes well and Nvidia stays on track, Pascal should come out by Q3 2016. There are rumors circulating that Nvidia will release big Pascal first, as oppose to what they did with Kepler and Maxwell. Just a rumor though, depends on the market as always.
what about this doubling of transistors? i heard most of that is going to be for hpc compute or something rather than gaming performance, so their performance estimate for gaming was somewhere between 50-60% vs the titan x. any ideas :/ ?



QUOTE(SSJBen @ Jul 27 2015, 02:33 PM)
If all goes well and Nvidia stays on track, Pascal should come out by Q3 2016. There are rumors circulating that Nvidia will release big Pascal first, as oppose to what they did with Kepler and Maxwell. Just a rumor though, depends on the market as always.
do you mean their high-end model will come out first ? that suits me fine. but i'd rather avoid a titan x class model and opt for a 980ti equivalent :/ i'd rather save money when possible xd.

This post has been edited by Moogle Stiltzkin: Jul 27 2015, 03:09 PM
SSJBen
post Jul 27 2015, 03:23 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(terradrive @ Jul 27 2015, 09:28 AM)
Yes, got one they claimed Fury Nano benchmarked but the article inside shows it's calculated  doh.gif
*
Yup.
And remember all the claims they made about Kepler before launch... lol, many of which were untrue other than the "state-the-obvious" remarks. doh.gif


QUOTE(Moogle Stiltzkin @ Jul 27 2015, 03:05 PM)
what about this doubling of transistors. i heard something about most of that is mostly going to be for hpc compute or something rather than mostly gaming performance, so their performance estimate for gaming was somehwere between 50-60% vs titanx. any ideas :/ ?
do you mean their highend model will come out first ? that suits me fine. but i rather avoid a titan x model, and rather opt for a 980ti equivalent :/ i rather save money when possible xd.
*
It makes sense; 50-60% is quite similar to the jump from Fermi to Kepler and from Kepler to Maxwell. HBM, while interesting, I don't think we will see most of its potential until 2017, when DX12 and Vulkan are much more mature. NVLink apparently will be focused on supercomputers only; not sure if it will make it to consumer grade cards or not? There's no confirmation on this.

Yeah, there is a rumor circulating around that PK100 will make the scene first, instead of PK104. I honestly... doubt it? laugh.gif
Moogle Stiltzkin
post Jul 27 2015, 04:01 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 27 2015, 03:23 PM)
Yup.
And remember all the claims they made about Kepler before launch... lol, many of which is untrue other than the "state-the-obvious" remarks. doh.gif
It makes sense, 50-60% is quite similar to that of Kepler from Fermi and Maxwel from Kepler. HBM while interesting, I don't think we will see most of its potential until 2017, when DX12 and Vulkan is much more matured. NVLink apparently will be focused for supercomputers only, not sure if it will make it to the consumer grade cards or not? There's no confirmation on this.

Yeah, there is a rumor circulating around that PK100 will make the scenes first, instead of PK104. I honestly... doubt it? laugh.gif
*
ooo i'll google that up then.

then this nvlink ? sounds like not only do i buy the pascal gpu, but i also need a new motherboard with nvlink as well ?

they say it would basically look like this
user posted image

QUOTE
Coming to the final pillar then, we have a brand new feature being introduced for Pascal: NVLink. NVLink, in a nutshell, is NVIDIA’s effort to supplant PCI-Express with a faster interconnect bus. From the perspective of NVIDIA, who is looking at what it would take to allow compute workloads to better scale across multiple GPUs, the 16GB/sec made available by PCI-Express 3.0 is hardly adequate. Especially when compared to the 250GB/sec+ of memory bandwidth available within a single card. PCIe 4.0 in turn will eventually bring higher bandwidth yet, but this still is not enough. As such NVIDIA is pursuing their own bus to achieve the kind of bandwidth they desire.

The end result is a bus that looks a whole heck of a lot like PCIe, and is even programmed like PCIe, but operates with tighter requirements and a true point-to-point design. NVLink uses differential signaling (like PCIe), with the smallest unit of connectivity being a “block.” A block contains 8 lanes, each rated for 20Gbps, for a combined bandwidth of 20GB/sec. In terms of transfers per second this puts NVLink at roughly 20 gigatransfers/second, as compared to an already staggering 8GT/sec for PCIe 3.0, indicating at just how high a frequency this bus is planned to run at.


user posted image

QUOTE
Multiple blocks in turn can be teamed together to provide additional bandwidth between two devices, or those blocks can be used to connect to additional devices, with the number of bricks depending on the SKU. The actual bus is purely point-to-point – no root complex has been discussed – so we’d be looking at processors directly wired to each other instead of going through a discrete PCIe switch or the root complex built into a CPU. This makes NVLink very similar to AMD’s Hypertransport, or Intel’s Quick Path Interconnect (QPI). This includes the NUMA aspects of not necessarily having every processor connected to every other processor.

But the rabbit hole goes deeper. To pull off the kind of transfer rates NVIDIA wants to accomplish, the traditional PCI/PCIe style edge connector is no good; if nothing else the lengths that can be supported by such a fast bus are too short. So NVLink will be ditching the slot in favor of what NVIDIA is labeling a mezzanine connector, the type of connector typically used to sandwich multiple PCBs together (think GTX 295). We haven’t seen the connector yet, but it goes without saying that this requires a major change in motherboard designs for the boards that will support NVLink. The upside of this however is that with this change and the use of a true point-to-point bus, what NVIDIA is proposing is for all practical purposes a socketed GPU, just with the memory and power delivery circuitry on the GPU instead of on the motherboard.



user posted image
Molex's NeoScale: An example of a modern, high bandwidth mezzanine connector


QUOTE
NVIDIA is touting that the new connector and bus will improve both energy efficiency and energy delivery. When it comes to energy efficiency NVIDIA is telling us that per byte, NVLink will be more efficient than PCIe – this being a legitimate concern when scaling up to many GPUs. At the same time the connector will be designed to provide far more than the 75W PCIe is spec’d for today, allowing the GPU to be directly powered via the connector, as opposed to requiring external PCIe power cables that clutter up designs.

With all of that said, while NVIDIA has grand plans for NVLink, it’s also clear that PCIe isn’t going to be completely replaced anytime soon on a large scale. NVIDIA will still support PCIe – in fact the blocks can talk PCIe or NVLink – and even in NVLink setups there are certain command and control communiques that must be sent through PCIe rather than NVLink. In other words, PCIe will still be supported across NVIDIA's product lines, with NVLink existing as a high performance alternative for the appropriate product lines. The best case scenario for NVLink right now is that it takes hold in servers, while workstations and consumers would continue to use PCIe as they do today.


Too much to quote, the rest is here
http://www.anandtech.com/show/7900/nvidia-...ecture-for-2016
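quick sanity check on those bandwidth numbers (a minimal sketch; the 8 lanes x 20Gbps per block and the ~16GB/sec PCIe 3.0 x16 figure are both from the article above):

CODE
# NVLink block bandwidth as described in the article above
lanes_per_block = 8
gbps_per_lane = 20
block_gbps = lanes_per_block * gbps_per_lane   # 160 Gbps
block_gb_per_s = block_gbps / 8                # = 20 GB/s per block, as quoted
pcie3_x16_gb_per_s = 16                        # rounded PCIe 3.0 x16 figure from the article
print(block_gb_per_s, block_gb_per_s / pcie3_x16_gb_per_s)  # 20 GB/s, ~1.25x PCIe 3.0 x16

so a single block is only about 1.25x a full PCIe 3.0 x16 link; the bigger wins come from teaming multiple blocks and from the point-to-point topology, not from one block alone.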



anyway it sounds like an nvlink mobo isn't a prerequisite to use pascal, you can still use pcie. but the question is, would using nvlink for a single gpu be worth it ? or is it only going to help multi gpu setups ? I'm not a fan of multi gpu cause of driver support issues doh.gif so just wondering if upgrading to an nvlink mobo is worth it for a single gpu setup. i'd rather wait for cannonlake+ before i upgrade sweat.gif


This post has been edited by Moogle Stiltzkin: Jul 27 2015, 04:03 PM
arslow
post Jul 27 2015, 10:14 PM

Look at all my stars!!
*******
Senior Member
3,544 posts

Joined: Sep 2008


QUOTE(cstkl1 @ Jul 25 2015, 08:01 PM)
Gsync .. Vrr is not crap dude. U got to try it to understand it

Doubt pascal flagship can do that. Generall ure gonna need 2x performance of a Tx. Since each gen is about 40-50% gain.

So we are two gen away. Volta will be it.

Skylake interesting part is nvme. Only reason to upgrade.
Ddr4 hasnt matured to 8gb density sticks at 4800 c18-c21. No news on dp 1.3.
usb c currently in its infancy.

Z170 is bound to have more dmi issues with the pch. Z87/z97 has tons of issues already.
*
Ugh, every day I'm getting less and less interested in replacing my 2500k with a skylake platform. I guess what I'm gonna do with my rig overhaul budget (about 4k or so) is just get a new case and go all out on the gpu... maybe get a 1080ti or whatever they decide to name it lol.

My u2412m is only 3 years old now. I've promised myself not to change it till the warranty is over, but it's getting harder and harder to do so, what with the existence of 27" ips 144hz WQHD monitors these days!!!!
cstkl1
post Jul 27 2015, 10:27 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(arslow @ Jul 27 2015, 10:14 PM)
Ugh, everyday I'm getting less and less interested in replacing my 2500k with a skylake platform.  I guess what I'm gonna do with my rig overhaul budget(about 4k or so) is just get a new case and go all out on gpu...maybe get a 1080ti or whatever they decide to name it lol.

My u2412m is only 3 years old now. Have promised myself to not change it till the warranty is over, but it's getting harder and harder to do so, whatnot with the existence of 27" ips 144hz WQHD monitors these days!!!!
*
haswell benefit

if u want more
faster encoding via the huge dmi bandwidth.
sata 3 native ports
usb 3 ports

skylake benefits
NVME ssd
native usb 3.1 ports ( not sure just assuming at this point)

skylake: still deciding should i go nvme. definitely no obvious gain other than watching ssd benchmarks.
zero benefit on gaming to be honest.

usb 3.1: seriously, is there any device out there that's taking advantage of its direct access to dmi gen 1 speeds??

nvlink: it's not something ure gonna see being beneficial on daily consumer gaming setups.
it will however open up the possibility of an octa titan pascal. also with such tech i am pretty sure nvidia has solved some multi gpu scaling here.
also the interesting part about nvlink is .. the ability of a gpu to access the ram and vram of the other gpu directly, if i am not mistaken.
the tech is gonna solve current issues with multi gpu.

This post has been edited by cstkl1: Jul 27 2015, 10:30 PM
shikimori
post Jul 28 2015, 12:05 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(arslow @ Jul 27 2015, 10:14 PM)
Ugh, everyday I'm getting less and less interested in replacing my 2500k with a skylake platform.  I guess what I'm gonna do with my rig overhaul budget(about 4k or so) is just get a new case and go all out on gpu...maybe get a 1080ti or whatever they decide to name it lol.

My u2412m is only 3 years old now. Have promised myself to not change it till the warranty is over, but it's getting harder and harder to do so, whatnot with the existence of 27" ips 144hz WQHD monitors these days!!!!
*
go get yourself a 144hz IPS monitor. You won't regret it man, the games you play feel really buttery smooth without sacrificing colors and viewing angles. I have not experienced g-sync or freesync but 144hz is really worth it provided you have a decent gpu.

Think of it like having that COD-esque movement feeling when playing non-COD games, even in RTS or games like Diablo 3.
TSskylinelover
post Jul 28 2015, 12:12 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 27 2015, 04:01 PM)
ooo i'll google that up then.

then this nvlink ? sounds like not only do i buy the pascal gpu, but also need a new motherboard with nvlink as well ?

they say it would look this basically
user posted image
user posted image
user posted image
Molex's NeoScale: An example of a modern, high bandwidth mezzanine connector
Too much to quote, the rest is here
http://www.anandtech.com/show/7900/nvidia-...ecture-for-2016
anyway sounds like nvlink mobo isn't a pre-requisite to use a pascal, can still use pcie. but the question, would using nvlink for a single gpu be worth it ? or is it only going to help for multi gpu setups ? I'm not a fan of multi gpus cause of driver support issues doh.gif so just wondering if upgrading to nvlink mobo is worth it for a single gpu setup. i rather wait for a cannonlake + before i upgrade  sweat.gif
*
Cannonlake wait till hair drop wont come out so soon laugh.gif doh.gif pulled the trigger from lynfield 2 haswell last february with no regret haha

As 4 pascal, i still can hold out my kepler till Q3 next year if everything went according 2 plan. Being 30s means i no longer able 2 buy gpu every year unlike student days using sugar daddy money. Haha.

P/s : having 1500 per month used 2 be heaven like being student but now salary double that not enough 2 cover myself doh.gif probably need quad only enough and dont start talking about after having children with me shakehead.gif

That is unless i live malay way of 5 children with low grade milk powder hahahaha i know my 15 yrs longest serving malay supervisor ever with slightly higher salary than me already working in 2 years can support his 5 kids comfortably and maybe with some help from the G perhaps aiks

This post has been edited by skylinelover: Jul 28 2015, 12:24 AM
Moogle Stiltzkin
post Jul 28 2015, 01:15 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Jul 27 2015, 10:27 PM)
haswell benefit

if u want more
faster encoding via the huge dmi bandwidth.
sata 3 native ports
usb 3 ports

skylake benefits
NVME ssd
native usb 3.1 ports ( not sure just assuming at this point)

skylake still deciding should i go nvme. definately no obvious gain other then watching ssd benchmarks.
zero benefit on gaming to be honest.

usb 3.1 seriously is there any device out there thats taking advantage of its direct access to dmi gen 1 speeds??

nvlink  its not something ure gonna see beneficial on daily commercial gaming setups.
it will how ever open up to a possible octa titan pascal. also with such tech i am pretty sure nvidia has solve some multi gpu scaling here.
also interesting part about nvlink is .. the ability of gpu to access ram and vram of the other gpu directly if i am not mistaken.
the tech is gonna solve current issues with multi gpu.
*
what is the performance improvement of skylake over ivy bridge ? hmm.gif do i need an upgrade to skylake to benefit from pascal? if not for compute performance, then at least for nvlink, but would that make any difference for single gpu setups, or would pcie3 be enough ?



QUOTE(shikimori @ Jul 28 2015, 12:05 AM)
go get yourself a 144hz IPS monitor . You wont regret it man , game that you play feels really buttery smooth without sacrificing colors and viewing angle . I have not experience g-sync or freesync but 144hz is really worth it  provided you have a decent gpu

Think of it like having the COD esque movement feeling  when playing non COD games even on RTS or games like Diablo 3 .
*
the thing is, there are a few technologies coming not long after. one i can think of is quantum dot, which they would add as a film layer onto monitors for improved color. and who knows, maybe the led backlight too would get an upgrade from the now common w-led to something like a gb-r led or better hmm.gif for even better colors. but if current ips w-led technology is more than sufficient, then yeah the acer predator 144hz gsync ips monitor seems to be the best gaming monitor out atm. the only thing i can critique is the monitor build, which has a light-reflective plastic frame. but still, considering everything else passes scrutiny over at tft central, the few flaws seem worth glossing over smile.gif



QUOTE(skylinelover @ Jul 28 2015, 12:12 AM)
Cannonlake wait till hair drop wont come out so soon laugh.gif doh.gif pulled the trigger from lynfield 2 haswell last february with no regret haha

As 4 pascal, i still can hold out my kepler till Q3 next year if everything went according 2 plan. Being 30s means i no longer able 2 buy gpu every year unlike student days using sugar daddy money. Haha.

P/s : having 1500 per month used 2 be heaven like being student but now salary double that not enough 2 cover myself doh.gif probably need quad only enough and dont start talking about after having children with me shakehead.gif

That is unless i live malay way of 5 children with low grade milk powder hahahaha i know my 15 yrs longest serving malay supervisor ever with slightly higher salary than me already working in 2 years can support his 5 kids comfortably and maybe with some help from the G perhaps aiks
*
well kepler was out in 2012, and pascal is out in 2016, so a 4 year upgrade seems due for me. or i could possibly delay by 1-2 more years for volta (though i'd rather not do that). besides, there just isn't enough info on what volta has to even warrant waiting. not to mention, because volta was delayed, i suspect pascal will more or less be the base of what volta was going to be, but is the product they will rush out first to cover the delay. so i'm placing my bet that the performance difference between pascal and volta won't be too huge. besides, there's always going to be something better. but i think pascal will be powerful enough to satisfy my pc gaming requirements for a long time

I think pascal is most likely to have a big enough performance upgrade to warrant upgrading at this juncture. i'm not the type of tech enthusiast who upgrades every year, not rich enough for that rolleyes.gif

but i can tell you now that the kepler gtx680 for me is not enough for current games. i tried ultra settings on dragon age inquisition and it was totally unplayable. had to lower quality settings to medium/high sad.gif

not playing at 144hz gsync (but instead my paltry 60hz triple-buffered vsync mode) is one thing; but not being able to play at ultra settings at 1080p resolution on a 24'' lcd ips screen is a bit too much for me to ignore doh.gif

This post has been edited by Moogle Stiltzkin: Jul 28 2015, 01:18 AM
arslow
post Jul 28 2015, 09:25 AM

Look at all my stars!!
*******
Senior Member
3,544 posts

Joined: Sep 2008


QUOTE(cstkl1 @ Jul 27 2015, 10:27 PM)
haswell benefit

if u want more
faster encoding via the huge dmi bandwidth.
sata 3 native ports
usb 3 ports

skylake benefits
NVME ssd
native usb 3.1 ports ( not sure just assuming at this point)

skylake still deciding should i go nvme. definately no obvious gain other then watching ssd benchmarks.
zero benefit on gaming to be honest.

usb 3.1 seriously is there any device out there thats taking advantage of its direct access to dmi gen 1 speeds??

nvlink  its not something ure gonna see beneficial on daily commercial gaming setups.
it will how ever open up to a possible octa titan pascal. also with such tech i am pretty sure nvidia has solve some multi gpu scaling here.
also interesting part about nvlink is .. the ability of gpu to access ram and vram of the other gpu directly if i am not mistaken.
the tech is gonna solve current issues with multi gpu.
*
Yeah, usb 3.1 is like the only reason I can see to upgrade to skylake lol. Basically would love to change just my motherboard and keep the cpu, but obviously Intel would never let that zzz.

Games are barely cpu bottlenecked these days, I really feel like the only real reason I would upgrade my cpu would be if my mobo or cpu dies....

Nvlink... not too crazy about it as I've never been interested in multi gpu.

Will definitely be upgrading to pascal though from my Kepler.
arslow
post Jul 28 2015, 09:28 AM

Look at all my stars!!
*******
Senior Member
3,544 posts

Joined: Sep 2008


QUOTE(shikimori @ Jul 28 2015, 12:05 AM)
go get yourself a 144hz IPS monitor . You wont regret it man , game that you play feels really buttery smooth without sacrificing colors and viewing angle . I have not experience g-sync or freesync but 144hz is really worth it  provided you have a decent gpu

Think of it like having the COD esque movement feeling  when playing non COD games even on RTS or games like Diablo 3 .
*
Would love to do that, but feel like squeezing as much as possible from the current monitor before moving on to something better.

And asus and acer aren't exactly the best brands in terms of QC. How I wish dell made a nice 27" IPS 144hz WQHD screen...
TSskylinelover
post Jul 28 2015, 10:00 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 28 2015, 01:15 AM)
well kepler was out in 2012, and pascal is out in 2016. so a 4 year upgrade seems due for me. or can possible delay by 1-2 year more for a volta (though i rather not do that). besides there just isn't enough info for what volta has to even warrant waiting. not to mention because volta was delayed, i suspect pascal more or less will be the base of what volta was, but is the product they will rush out first to cover the delay. so i'm placing my bet the performance difference won't be too huge from pascal and volta. besides theres always going to be something better. but i think pascal will be powerful enough to satisfy my pc gaming requirements for a long time

I think pascal is most likely to have a big performance upgrade to warrant upgrading in this point in juncture. i'm not the type of tech enthusiast who upgrades every next year, not rich enough for that  rolleyes.gif

but i can tell you know that kepler 680gtx for me is not enough for current games. i tried ultra settings on dragon age inquistion and it was totally unplayable. had to lower quality settings to medium/high  sad.gif

not playing at 144hz gsync (but instead my paltry 60ghz triple sync vsync mode) is one thing; but not being able to play at ultra settings on a 1080p resolution on a 24'' lcd ips screen is a bit too much for me to ignore  doh.gif
*
Haha same boat here. I also was tempted with gtx680 last time but decided 2 skip over kepler rehash because just started my career. So cannot simply splash unlike students days. Now i target 3 year gap cycle instead of 2. Since i already hop in 1440p zone, its either high end or mid end SLI next year. Haha.


cstkl1
post Jul 28 2015, 10:06 AM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

Moogle Stiltzkin
Well, it depends on the task, but generally clock for clock:

haswell DC has a 10% single thread gain
Skylake has around 5 percent over haswell.

so ~15.5% faster. Also, comparing ivy bridge to haswell is unfair as it depends on the task. Things like avx/avx2 encoding .. haswell is about 20-30 percent faster.

Skylake's real benefit is the 20 pcie lanes. So u can run sli with nvme. Also now mobile gpus can have either sli or a native 16x single gpu with an m.2 pcie ssd.

The con is ddr4 is too immature. Any platform that has ddr3/ddr4 dimm slots.. don't expect too much from it. Same as p55.
DDR4 atm is too immature. Generally i'm hoping for 4800 CL18-21,

but expect ram to evolve slower this time since most ram in the world nowadays is either micron, samsung or sk hynix.
The latter two seem to be concentrating on die stacking.
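(the ~15.5% above is the two gains compounded rather than added; a minimal sketch with the same numbers:)

CODE
# compounding the clock-for-clock gains quoted above
ivy_to_haswell = 1.10      # ~10% single-thread gain
haswell_to_skylake = 1.05  # ~5% on top of haswell
print(ivy_to_haswell * haswell_to_skylake)  # 1.155 -> ~15.5% over ivy bridge, clock for clock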


This post has been edited by cstkl1: Jul 28 2015, 10:09 AM
Moogle Stiltzkin
post Jul 28 2015, 10:26 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Jul 28 2015, 10:06 AM)
Moogle Stiltzkin
Well depending task but generally clock to clock

haswell DC has 10% single thread gain
Skylake has around 5 percent to haswell.

so 15.5% faster. Also comparing ivybridge to haswell is unfair as it depends on the task. Things like avx/avx2 encoding .. haswell is about 20-30 percent faster.

Skylake real benefit is the 20 pcie lane. So u can run sli with nvme. Also now mobile gpus can have either sli or native 16x single gpu with m.2 pcie ssd.

The cons is ddr4 is too immature. Any platform that has ddr3/ddr4 dimm slots.. dont expect too much from it. Same like p55.
DDR4 atm too immature. Generally i hoping for 4800 CL18-21

but expect rams to evolve slower this time since most rams in the world nowadays is either micron, samsung or sk hynix.
The later two seems to be concentrating on die stacking.
*
i'm waiting for hmc but there is hardly any news on when intel will add support for it to their chipsets and motherboards. ddr4 seems like something that's gonna die real soon :/ especially now that stuff like hbm is coming out.

though i doubt hbm will be used on motherboards because it was designed for graphics usage. hmc was designed for pcs. but from what i heard hmc is not a jedec standard ? not sure why hmm.gif

so 15.5 % more fps ? from the cpu hmm.gif

i guess i will have to test out first what fps i get on pascal. i'll be using a 1920x1200 res 24'' screen, i think that's more than enough to hit my 60fps cap when using triple buffering vsync hmm.gif


by the way i preordered starcraft 2 legacy of the void. the graphics are not intensive at all .... lel... even my kepler can drive this game at ultra doh.gif fuggin blizzard.....

gotta put my hopes in the new doom game to really push the gpu laugh.gif though not sure what the game will be like. think carmack left that company :[


As i suspected amd is gonna rush out their pirate gpu
http://wccftech.com/amd-r9-400-series-gpus...arctic-islands/


but there is already speculation on how things will look by then *change 970 to pascal
user posted image

laugh.gif


QUOTE
An update from sweclockers.com has revealed the code name of the next-next-gen AMD Radeon R9 400 Series GPUs. It goes without saying that the naming is pretty much irrelevant at such an early stage. The revealed codename for the series is Arctic Islands and the actual GPU name could be of any island present there. There is currently zero information regarding the Radeon R9 400 Series apart from the fact it will be based on a 20nm or lower node (most probably 16/14nm FinFET).


small nm fabrication is good but i doubt it will be enough. have they had time to develop a new architecture that's more power efficient and still has great performance ? cause i haven't heard anything about that. unlike volta and pascal, where we at least heard long ago that they were working on a new architecture hmm.gif

This post has been edited by Moogle Stiltzkin: Jul 28 2015, 10:29 AM
TSskylinelover
post Jul 28 2015, 10:57 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Jul 28 2015, 10:26 AM)
i'm waiting for hmc but there is hardly any news when intel will add support to their chipsets and motherboards for this. ddr4 seems like something gonna die real soon :/ especially now that stuff like hbm is coming out.

though i doubt hbm will be used on motherboards because it was designed for graphics usage. hmc was designed for pc. but from what i heard hmc is not a jedec standard ? not sure why  hmm.gif

so 15.5 % fps more ? from cpu  hmm.gif

i guess will have to test out first what fps i get on pascal. i'll be using on a 1080p 1920x1200 reso 24'' i think thats more than enuff to hit my 60fps cap when using triple buffering vsync  hmm.gif
by the way i preordered starcraft 2 legacy of the void. the graphics is not intensive at all .... lel... even my kepler can drive this game at ultra  doh.gif  fuggin blizzard.....

gotta put my hopes in new doom game to really push the gpu  laugh.gif  though not sure how the game will be like. think carmack left that company :[
As i suspected amd is gonna rush out their pirate gpu
http://wccftech.com/amd-r9-400-series-gpus...arctic-islands/
but there is already speculation out how things will be like then *change 970 to pascal
user posted image

laugh.gif
small nm fabrication is good but i doubt it will be enough. have they had time to develop a new architecture to be more power efficient and still have great performance ? cause i hadn't heard anything about that. Unlike volta and pascal we at least long ago heard they were working on a new architecture  hmm.gif
*
Haha i more stoked with doom 3 than doom 4. Especially now with my heavy workload daily just 2 get double earnings from student allowance. I am less enthusiastic the doom 4 but if reviews are great, i will buy the game surely. I definitely miss unis days more since jumping in workforce world. Argh dang it.
takeshiru
post Jul 28 2015, 12:46 PM

Getting Started
**
Junior Member
91 posts

Joined: Jun 2010
Hello guys..

Hope to get some opinions here.. currently using a strix 980.. and a 1080p monitor..
Sell my 980 off and upgrade to a 980Ti? Or get the acer predator?

I'm leaning toward the monitor as my current one isn't great.. If i get the 980ti, it could be a waste using it on a dell non-gsync monitor, right?
arslow
post Jul 28 2015, 01:49 PM

Look at all my stars!!
*******
Senior Member
3,544 posts

Joined: Sep 2008


QUOTE(takeshiru @ Jul 28 2015, 12:46 PM)
Hello guys..

Hope to get some opinion here..currently using strix980.. And a 1080 monitor..
Selling my 980 off and upgrading to 980ti? Or get the acer predator?

Im leaning toward the monitor as my current one isnt great.. If i get the 980ti , could be a waste using it on a dell nom gsync monitor right?
*
I would upgrade the monitor rather than the gpu if I were you.
Moogle Stiltzkin
post Jul 28 2015, 02:30 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(takeshiru @ Jul 28 2015, 12:46 PM)
Hello guys..

Hope to get some opinion here..currently using strix980.. And a 1080 monitor..
Selling my 980 off and upgrading to 980ti? Or get the acer predator?

Im leaning toward the monitor as my current one isnt great.. If i get the 980ti , could be a waste using it on a dell nom gsync monitor right?
*
i'd rather spend money on acer predator.

980 to 980ti isn't as good as 980 + acer predator.

10-20% fps gains is nothing compared to 144hz refresh rate, gsync, 1440p resolution, 27'' IPS !!!

besides if you already got a 980 why get a 980ti ? when they just reported that pascal is next year 2016 with double the transistor count and most likely to have gains of 50-60% over a titan x ?

anyway get the acer predator you won't regret flex.gif

yaphong
post Jul 28 2015, 10:06 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(Moogle Stiltzkin @ Jul 28 2015, 02:30 PM)
i'd rather spend money on acer predator.

980 to 980ti isn't as good as 980 + acer predator.

10-20% fps gains is nothing compared to 144hz refresh rate, gsync, 1440p resolution, 27'' IPS !!!

besides if you already got a 980 why get a 980ti ? when they just reported that pascal is next year 2016 with double the transistor count and most likely to have gains of 50-60% over a titan x ?

anyway get the acer predator you won't regret  flex.gif
*
Yeah but with 980 it is hard to achieve 144 fps at 1440p. Mine is just nice for 60 fps at 1440p for most games at full settings.
yaphong
post Jul 28 2015, 10:09 PM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(takeshiru @ Jul 28 2015, 12:46 PM)
Hello guys..

Hope to get some opinion here..currently using strix980.. And a 1080 monitor..
Selling my 980 off and upgrading to 980ti? Or get the acer predator?

Im leaning toward the monitor as my current one isnt great.. If i get the 980ti , could be a waste using it on a dell nom gsync monitor right?
*
I thought of upgrading my 980 to a 980Ti too. However, even if I can sell off my current card at RM1900, I still need to top up an additional RM1400 to get a 980Ti Strix (Strix to Strix for a fair comparison), and this is just for about a 20% to 40% fps gain. For RM1400 I think it is much better to spend on a PS4 hahaha
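rough math on that top-up, just to show the value side (a small sketch; the RM figures and the 20-40% gain range are the ones above, nothing else assumed):

CODE
# cost of the Strix 980 -> Strix 980 Ti jump using the figures above
sell_980 = 1900               # RM, estimated resale of the 980
top_up = 1400                 # RM, extra cash needed
ti_price = sell_980 + top_up  # ~RM3300 for the 980 Ti Strix
for gain_pct in (20, 40):     # quoted fps gain range
    print(top_up / gain_pct)  # RM70 down to RM35 per percentage point of fps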
SSJBen
post Jul 28 2015, 11:25 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 28 2015, 02:30 PM)
i'd rather spend money on acer predator.

980 to 980ti isn't as good as 980 + acer predator.

10-20% fps gains is nothing compared to 144hz refresh rate, gsync, 1440p resolution, 27'' IPS !!!

besides if you already got a 980 why get a 980ti ? when they just reported that pascal is next year 2016 with double the transistor count and most likely to have gains of 50-60% over a titan x ?

anyway get the acer predator you won't regret  flex.gif
*
Problem with Acer? Great panel, nice tech. But the cheapest ass looking glossy plastic and bezels as thick as 2008 monitors. Oh and don't get me started on the QC.... sweat.gif
Over RM2k+ for a monitor, it is only right to expect something much better. It barely costs half as much to produce the monitor.

This is why I want Dell and LG to come up with their Gsync monitors. LG already makes their own IPS panels anyway, and their curved screens have been doing pretty decently considering how niche that market is. So what the hell is stopping them?

This post has been edited by SSJBen: Jul 28 2015, 11:26 PM
takeshiru
post Jul 29 2015, 01:39 AM

Getting Started
**
Junior Member
91 posts

Joined: Jun 2010
QUOTE(Moogle Stiltzkin @ Jul 28 2015, 02:30 PM)
i'd rather spend money on acer predator.

980 to 980ti isn't as good as 980 + acer predator.

10-20% fps gains is nothing compared to 144hz refresh rate, gsync, 1440p resolution, 27'' IPS !!!

besides if you already got a 980 why get a 980ti ? when they just reported that pascal is next year 2016 with double the transistor count and most likely to have gains of 50-60% over a titan x ?

anyway get the acer predator you won't regret  flex.gif
*
Yea strongly agree too.. but acer QC seems questionable.. still the best ips so far..
takeshiru
post Jul 29 2015, 01:41 AM

Getting Started
**
Junior Member
91 posts

Joined: Jun 2010
QUOTE(yaphong @ Jul 28 2015, 10:09 PM)
I thought of upgrading my 980 to 980Ti too. However even if I can sell of my current card at RM1900, I still need to top up additional RM1400 to get 980Ti Strix (Strix to Strix for fair comparison) and this is just for about 20% to 40% fps gain. For RM1400 I think it is much better to spend on PS4 hahaha
*
Tested the market with mine recently.. you're lucky if u get 1.8k for it.. haha usually lower..
Moogle Stiltzkin
post Jul 29 2015, 02:00 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 28 2015, 11:25 PM)
Problem with Acer? Great panel, nice tech. But the cheapest ass looking glossy plastic and bezels as thick as 2008 monitors. Oh and don't get me started on the QC....  sweat.gif
Over RM2k+ for a monitor, it is only right to expect something much better. It barely costs half as much to produce the monitor.

This is why I want Dell and LG to come up with their Gsync monitors. LG already makes their own IPS panels anyway, their curved screens has been doing pretty decent considering how niche that market is. So what the hell is stopping them?
*
i thought the glossy part was only the frame not the actual screen hmm.gif

if the screen is also glossy i might reconsider :/ cause that would be friggin annoying... (not a fan of glossy).


*update

there i double checked

QUOTE
Panel Coating
Light AG coating



QUOTE
Glossy black bezel and stand, some red trim on base



QUOTE
Anti-Glare Coating (AG)

The most common type of protective coating is ‘Anti-Glare’ (AG). This is often described as a matte coating as it is non-reflective to the user since it diffuses rather than reflects ambient light. It provides a method for manufacturers to avoid glare on the viewing surface from other light sources and has been used in the LCD monitor market for many years since the first TFT displays started to emerge.

This matte coating is included as an outer polarizing layer which has been coarsened by mechanical or chemical processes. This achieves a surface coating which is not smooth and so can diffuse ambient light rather than reflect it. What is particularly important to understand is that this AG coating can be applied to panels with varying thicknesses, which has an impact on the anti-glare properties, but also on the underlying image of the screen. Where the coating is particularly thick and aggressive, the image from the screen can deteriorate as the light being emitted is also affected. This can have some impact on contrast and colour vibrancy and the perceived image can sometimes look dull as a result. Sharpness degradation can also occur in some extreme cases where AG coating is too thick. Users may also sometimes see the graininess of the coating, particularly when viewing white or light backgrounds. This can be particularly distracting for office work and images can look grainy or dirty if the coating is too aggressive. I would point out that not everyone would even notice this at all, and many users are perfectly happy with their screens even where aggressive AG is used. It’s just something to be wary of in case you have found problems with image quality in the past or are susceptible to it.

In other cases, AG coating is applied but it is light and far less obtrusive. The polarizer is less rough and has a lower haze value. Sometimes users refer to it as “semi-gloss” to distinguish the difference between these and the heavy AG coatings. This provides anti-glare properties but does not result in the grainy appearance of images. It is not a fully glossy solution though.

AG coating has long been the choice for nearly all professional-grade displays as well, helping to avoid issues with reflections and external light sources which are vital for colour critical work. In addition it should be noted that AG coating is less susceptible to dust, grease and dirt marks which can become an issue on reflective glossy coating alternatives.

http://www.tftcentral.co.uk/articles/panel_coating.htm



so the only issue is the bezel and stand. but honestly, as long as the panel itself is matte then i personally don't think it's that big a deal. Of course if another brand came up with similar specs and performed just as well, but without that glossy nonsense (when will they learn doh.gif ) then yeah i would recommend that instead.

but till then this is the way to go 2016 notworthy.gif


PS:
according to online sources this particular model also supports internal programmable 14-bit 3D lookup tables (LUTs) for calibration. so you can calibrate your settings directly into the monitor, so even video sources like when watching mpc would benefit from calibrating this monitor.

the only color con i can think of for this monitor is that it uses the cheaper w-led, rather than gb-r led. Though considering everything else minus the glossy bezel/stand, i think it's still way better than a tn panel. and it's got a 14bit internal lut which to me is also important to have as a bare minimum :} *need calibrator though





This post has been edited by Moogle Stiltzkin: Jul 29 2015, 02:12 AM
Maxieos
post Jul 29 2015, 02:39 AM

Look at all my stars!!
*******
Senior Member
3,754 posts

Joined: May 2008
May I know what's the difference between upgrading the driver by going to the nvidia website and downloading it, compared to going to windows update and updating the intel and nvidia drivers there ?
Moogle Stiltzkin
post Jul 29 2015, 09:17 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Maxieos @ Jul 29 2015, 02:39 AM)
May I know what's the difference between upgrading the driver by going to the nvidia website and downloading it, compared to going to windows update and updating the intel and nvidia drivers there ?
*
i use this app to uninstall drivers
http://www.guru3d.com/files-details/displa...r-download.html


but before that i already downloaded the drivers first from
http://www.geforce.com/drivers


so in the DDU process it reboots you into safe mode, then proceeds to uninstall the driver. then it reboots back to windows.

now can safely install from the drivers downloaded direct from nvidia doh.gif


Moogle Stiltzkin
post Jul 29 2015, 09:31 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
this is a very good article concerning why the nvidia camp has gameworks, and hows it's impacting the amd camp
http://wccftech.com/exclusive-nvidias-amds...s-bottom-issue/


i notice though in benchmarks, gameworks tend to decrease fps by a lot. the benchmark i saw was in 4k, where playable settings were on medium to high, and with gameworks disabled doh.gif


QUOTE
What GameWorks Is And Why It Exists

GameWorks is a developer program set-up by Nvidia to provide game developers with a collection of graphics libraries and tools aimed at improving the visual quality of games. It includes technologies such as PhysX – Nvidia’s proprietary physics engine – as well as VisualFX which encompasses a number of Nvidia optimized rendering techniques and in-game visual effects. These include things like shadows, anti-aliasing, depth of field, global illumination, hair simulation, ambient occlusion, lighting and other effects.
Below you will find the VisualFX solutions as listed on Nvidia’s website.

VisualFX provides solutions for rendering and effects including:

HBAO+ Enhanced Horizon Based Ambient Occlusion
TXAA Temporal Anti-aliasing
Soft Shadows Improves on PCSS to reach new levels of quality and performance, with the ability to render cascaded shadow maps
Depth of Field Combination of diffusion based DOF and a fixed cost constant size bokeh effect
FaceWorks Library for implementing high-quality skin and eye shading
WaveWorks Cinematic-quality ocean simulation for interactive applications
HairWorks Enabling simulation and rendering of fur, hair and anything with fibers
GI Works Adding Global Illumination greatly improves the realism of the rendered image
Turbulence High definition smoke and fog with physical interaction as well as supernatural effects



i think gameworks is a good idea. but only games that add the features will have them. the only game i know that extensively revolved around gameworks was lords of the fallen. but what about other games, e.g. ones built on frostbite, cry engine, unreal engine etc ?


QUOTE
One of the discussion points that Brian and I talked about which did not make its way to the interview was about the balance of trading performance for added visual quality, and how much performance was being lost for some GameWorks features such as HairWorks.

I argued that due to how intensive HairWorks is at tessellation factor x32 ( the standard setting for the feature ) many gamers, including ones using NVIDIA hardware, may choose to disable it completely in favor of improving the overall performance of the game. And that those gamers would have probably appreciated the choice of toggling between different tessellation factors just as they would normally toggle between different gradients of any other effect such as shadows or anti aliasing. Rather they are forced to accept x32, which is very intensive, or nothing at all.



This post has been edited by Moogle Stiltzkin: Jul 29 2015, 10:08 AM
yaphong
post Jul 29 2015, 09:46 AM

On my way
****
Senior Member
659 posts

Joined: Apr 2005


QUOTE(takeshiru @ Jul 29 2015, 01:41 AM)
Test market with mine recently..lucky if u get 1.8k with it..haha usually lower..
*
That's why I think it is not worth upgrading it. Perhaps Pascal will offer much more improvement later...
Moogle Stiltzkin
post Jul 29 2015, 02:09 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
just updated to windows 10.... now waiting for dx 12 games to come out tongue.gif lel
AchievaTech
post Jul 29 2015, 02:31 PM

New Member
*
Newbie
3 posts

Joined: Jan 2015


QUOTE(shikimori @ Jul 24 2015, 11:26 PM)
quite a nice price sad.gif too bad im in penang .

Which one is better freesync or g-sync ? From what I heard freesync suffers from ghosting and limitation for refresh rate for example this mg279q is capped at 35hz-90hz

Gsync is capable of rates that range from 30Hz to 144Hz  cry.gif 


Taken from Wccftech
Read more: http://wccftech.com/amd-freesync-nvidia-gs.../#ixzz3gp5KJUp9
*
All you need is just ask ask around smile.gif

If you are interested to book one, i would suggest you make it fast as the shipment is coming in Mid August.
You can contact Gtech Solutions (Butterworth) as they do carry PowerColor products.


SUScrash123
post Jul 29 2015, 02:46 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
Hi guys..selling my Zotac GTX 980 ti brows.gif
click here
TSskylinelover
post Jul 29 2015, 03:44 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(crash123 @ Jul 29 2015, 02:46 PM)
Hi guys..selling my Zotac GTX 980 ti  brows.gif
click here
*
Sheeet so fast laugh.gif
TSskylinelover
post Jul 29 2015, 03:48 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(SSJBen @ Jul 28 2015, 11:25 PM)
Problem with Acer? Great panel, nice tech. But the cheapest ass looking glossy plastic and bezels as thick as 2008 monitors. Oh and don't get me started on the QC....  sweat.gif
Over RM2k+ for a monitor, it is only right to expect something much better. It barely costs half as much to produce the monitor.

This is why I want Dell and LG to come up with their Gsync monitors. LG already makes their own IPS panels anyway, their curved screens has been doing pretty decent considering how niche that market is. So what the hell is stopping them?
*
Haha same here

Dell fanboy reporting in laugh.gif rclxms.gif
SUScrash123
post Jul 29 2015, 03:48 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(skylinelover @ Jul 29 2015, 03:44 PM)
Sheeet so fast laugh.gif
*
haha icon_rolleyes.gif
SSJBen
post Jul 29 2015, 05:32 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Jul 29 2015, 02:00 AM)
i thought the glossy part was only the frame not the actual screen  hmm.gif

if screen also glossy i might reconsider :/ cause that would be friggin annoying... (not fan of glossy).
*update

there i double checked
http://www.tftcentral.co.uk/articles/panel_coating.htm
so the only issue is the bezel and stand. but honestly, as long as the panel itself is matte then i personally don't think it's that big a deal. Of course if another brand came up with similar specs and performed just as well, but without that glossy nonsense (when will they learn  doh.gif ) then yeah i would recommend that instead.

but till then this is the way to go 2016  notworthy.gif

PS:
according to online sources this particular model also supports internal programmable 14-bit 3D lookup tables (LUTs) for calibration. so you can calibrate your settings directly into the monitor, so even video sources like when watching mpc would benefit from calibrating this monitor.

the only color con i can think of for this monitor is that it uses the cheaper w-led, rather than gb-r led. Though considering everything else minus the glossy bezel/stand, i think it's still way better than a tn panel. and it's got a 14bit internal lut which to me is also important to have as a bare minimum :}  *need calibrator though
*
Oh yes, I do mean the bezels and the cheap stand that wobbles pretty badly. It may not seem like an issue, but the glossy bezels do cause reflection and personally, it is extremely annoying for me. I know, because I owned several Samsung and Viewsonic monitors back from 2009 when glossy was like icing on the cake, everyone must make something glossy. doh.gif

I agree the panel is indeed very good, there's really not much qualms about it. AUO has been doing a great job lately, over the last few years. They've really stepped up their game.

But like I said, the issue lies with Acer's QC. You can read up on other forums like reddit, neogaf, even OCN, there are numerous issues like terrible backlight bleed that just happens after a few weeks of usage. There are many stories about dead pixels too, pretty much unacceptable for a monitor at such a high price.

Moogle Stiltzkin
post Jul 29 2015, 05:37 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Jul 29 2015, 05:32 PM)
Oh yes, I do mean the bezels and the cheap stand that wobbles pretty badly. It may not seem like an issue, but the glossy bezels do cause reflection and personally, it is extremely annoying for me. I know, because I owned several Samsung and Viewsonic monitors back from 2009 when glossy was like icing on the cake, everyone must make something glossy. doh.gif

I agree the panel is indeed very good, there's really not much qualms about it. AUO has been doing a great job lately, over the last few years. They've really stepped up their game.

But like I said, the issue lies with Acer's QC. You can read up on other forums like reddit, neogaf, OCN even, there are numerous issues like terrible backlight bleed that just happens over a few weeks of usage. There many stories on dead pixels too, pretty much unacceptable for a monitor of such high price.
*
ok the backlight bleed i didn't look into. but depending on severity could be a major reason not to get it.

my own monitor has it in the bottom right corner, but it's not too bad, so i give it a pass.

but definitely prospective buyers need to check on this doh.gif
Maxieos
post Jul 29 2015, 07:35 PM

Look at all my stars!!
*******
Senior Member
3,754 posts

Joined: May 2008
Moogle Stiltzkin
sorry, can you explain the steps again ?

The pc has both drivers, one is intel and the other is geforce.

The DDU driver uninstall tool cannot uninstall the Intel driver right ?

Is installing the driver from the Windows 7 update optional section better, or is using DDU better ? or just download the nvidia driver and install it ?

Same graphics card, didn't change anything, just wanted to update for some extra features

Moogle Stiltzkin
post Jul 29 2015, 07:53 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Maxieos @ Jul 29 2015, 07:35 PM)
Moogle Stiltzkin
sorry , can explain the step again ?

The pc having both driver , one is intel and another is geforce.

The DDU driver uninstall tools cannot uninstall Intel driver right ?

Installing at Window 7 update driver at the optional section better or Using DDU better ? or just download the nvidia driver and get it install ?

Same graphic , didn't change anything , just wanted to update for some extra features
*
ddu can uninstall intel, nvidia and amd graphics smile.gif

step 1
download ddu

step 2
install ddu

step 3
open ddu (it asks you to reboot into safe mode, say yes)

step 4
when in safe mode it auto opens ddu. just select uninstall for whatever brand of graphics driver you're using. for laptops sometimes you got both intel and nvidia. so you do accordingly lah ;x

step 5
reboot

step 6
now install the nvidia driver. make sure you select custom / clean installation. usually i have this downloaded before i even begin the process, cause if internet cannot access mampus lah laugh.gif

step 7
sometimes for fun, i'll also use ccleaner to clean up the registry smile.gif though probably not needed, because DDU already cleans up all the graphics-related leftover files.
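(Not from the original post: if you want to double-check that the clean install actually took, one low-effort way is to ask the driver itself which version is active. A minimal Python 3 sketch, assuming nvidia-smi is on the PATH, which it normally is once the NVIDIA driver is installed.)

CODE
# Minimal sketch: print the active NVIDIA driver version after a clean install.
# Assumes nvidia-smi is on the PATH (it ships together with the driver).
import subprocess

def active_nvidia_driver_version() -> str:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    print("Active NVIDIA driver:", active_nvidia_driver_version())

Run it after step 6 and compare the printed version against the package you downloaded from geforce.com.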

This post has been edited by Moogle Stiltzkin: Jul 29 2015, 07:54 PM
SSJBen
post Jul 29 2015, 11:16 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


One other tip.

If you don't use Nvidia HD audio, untick it.
If you don't use GFE and Shadowplay, untick it.
If you don't use 3D, untick it.

Seriously, the only things that are absolutely needed is the display driver and PhysX, everything else is optional.
People keep getting into errors because they just press next blindly.
TSskylinelover
post Jul 30 2015, 12:36 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(SSJBen @ Jul 29 2015, 11:16 PM)
Seriously, the only things that are absolutely needed is the display driver and PhysX, everything else is optional.
People keep getting into errors because they just press next blindly.
*
haha so true laugh.gif doh.gif
goldfries
post Jul 30 2015, 12:50 AM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




QUOTE(SSJBen @ Jul 29 2015, 11:16 PM)
One other tip.

If you don't use Nvidia HD audio, untick it.
If you don't use GFE and Shadowplay, untick it.
If you don't use 3D, untick it.

Seriously, the only things that are absolutely needed is the display driver and PhysX, everything else is optional.
People keep getting into errors because they just press next blindly.
*
Physx is optional too. biggrin.gif
SSJBen
post Jul 30 2015, 02:11 AM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(goldfries @ Jul 30 2015, 12:50 AM)
Physx is optional too. biggrin.gif
*
Well theoretically yes.. but at least its better to have it than not have it for compatibility sake, unlike the other stuff.
ssxcool
post Jul 30 2015, 03:31 AM

Getting Started
**
Junior Member
255 posts

Joined: Sep 2011
guys can recommend me rm1k monitor?

currently using rm300 samsung cap ayam cyber cafe monitor. quality liek shiet
edmund_yung
post Jul 30 2015, 06:00 AM

Regular
******
Senior Member
1,198 posts

Joined: Aug 2009


QUOTE(shikimori @ Jul 24 2015, 09:42 PM)
went full retard and bought Asus MG279Q cannot wait for XB270hu  cry.gif long waiting period (up to 1 month) got Ass Creed Unity and DLC as free game bundle together

user posted image the box kinda retarded as well for missing the Q but its the same model

dunno worth it or not in terms of price . But dayummmm the moment playing BF4 at 144hz feels like playing call of duty , will I miss G-Sync or Freesync ?

I dont know but IPS + 144hz is a whole new experience for me at least .
*
I'm planning to get the Acer from IdealTech next month once I save enough cash. I didn't know about the one month wait, meaning I'll only get the screen in September?
shikimori
post Jul 30 2015, 06:18 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(edmund_yung @ Jul 30 2015, 06:00 AM)
I'm planning to get the Acer from IdealTech next month once I save enough cash. I didn't know about the one month wait, meaning I'll only get the screen in September?
*
Nah mine from lelong . Not sure about ideal tech leadtime just that their price is expensive compared to lelong
Moogle Stiltzkin
post Jul 30 2015, 10:59 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE
Over the weekend, Gordon Kelly at Forbes posted an article about Microsoft's botched Windows 10 Nvidia driver patch. Advanced Nvidia customers are furious -- for good reason. Thanks to the magic of Windows 10 forced updates, the good, old Nvidia driver was replaced with a new Microsoft-approved version, and all hell broke loose.


http://www.infoworld.com/article/2952996/m...ch-blocker.html


using windows 10 enterprise, i set updates to deferred. but other win10 users won't have this option.
http://www.extremetech.com/computing/21020...g-term-deferral


instead it would just simply update .....

the only way is to use the microsoft tool to manually hide/choose which updates to install or not install.

so you'll have to use that tool every time you want to update..... makes me wonder why they don't bother integrating the tool's options into the os itself then :{

http://gadgets.ndtv.com/apps/news/microsof...-updates-720758


interestingly i just updated the latest nvidia drivers, but it's still listing a new driver in the windows update..... wtf..


*update

weird now it's not doing that doh.gif guess needed a few minutes to detect the new drivers installed.. lel smile.gif



anyway i think for win10 users, you'll need to download the driver, and ddu. then disconnect your internet while you're reinstalling the driver. otherwise it will just download and install a driver on top of the other one automatically doh.gif

This post has been edited by Moogle Stiltzkin: Jul 30 2015, 11:25 AM
cstkl1
post Jul 30 2015, 01:59 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

Waiting for ti users to flood with vram swapping issues with w10.
sai86
post Jul 30 2015, 05:49 PM

StilL LearninG
*******
Senior Member
4,934 posts

Joined: Sep 2008
From: Setapak


QUOTE(cstkl1 @ Jul 30 2015, 01:59 PM)
Waiting for ti users to flood with vram swapping issues with w10.
*
what kind of issue is that? hmm.gif
Moogle Stiltzkin
post Jul 31 2015, 07:19 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Jul 30 2015, 01:59 PM)
Waiting for ti users to flood with vram swapping issues with w10.
*
hearthstone works but

heroes of the storm doesn't. had to use compatibility mode for 8.1 to get it to work, and even then it may sometimes crash after 1-2 hours or so if left on idle hmm.gif
Maxieos
post Aug 1 2015, 02:06 AM

Look at all my stars!!
*******
Senior Member
3,754 posts

Joined: May 2008
Moogle Stiltzkin Thanks for the guide biggrin.gif

Moogle Stiltzkin
post Aug 1 2015, 02:10 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Maxieos @ Aug 1 2015, 02:06 AM)
Moogle Stiltzkin Thanks for t he guide biggrin.gif
*
np flex.gif
SUSPh 7.00
post Aug 1 2015, 02:19 AM

Casual
***
Junior Member
309 posts

Joined: Oct 2014
Has anyone tried running Lord of the Fallen at max settings 1080p on their vga? I think this game really sweats any vga muscle at max settings. Care to comment about the temperature and GPU usage percentage? My gtx770 rebooted automatically many times even though the resolution was reduced to 1280x720. sweat.gif Now playing with constant 100% fan speed (with max loudness rclxub.gif ).

This post has been edited by Ph 7.00: Aug 1 2015, 02:19 AM
shikimori
post Aug 1 2015, 06:20 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(Ph 7.00 @ Aug 1 2015, 02:19 AM)
Anyone has tried this vga max setting 1080p on Lord of the Fallen? I think this game really sweats any vga muscle at max setting. Care to comment about the temperature and GPU usage percentage? My gtx770 got reboot automatically many times although resolution has been reduced to 1280x720.  sweat.gif Now playing with constant 100% fan speed (with max loudness rclxub.gif ).
*
lol the game so old why only now play ? The crashing does it occur only to the Lord of fallen game or any other game as well ?
SUSPh 7.00
post Aug 1 2015, 07:32 AM

Casual
***
Junior Member
309 posts

Joined: Oct 2014
QUOTE(shikimori @ Aug 1 2015, 06:20 AM)
lol the game so old why only now play ?  The crashing does it occur only to the Lord of fallen game or any other game as well ?
*
So far only Lord of the Fallen. I tested Alien Isolation (which I think is on par graphically with LOF hmm.gif ) at max settings 1080p with no problem. Witcher 3 as well can run at max settings (minus hair animation) at less than 30fps though, but no sudden reboot has occurred so far. So before I invest in the 980Ti, I want to make sure it is really worth it and can run LOF without breaking a sweat like my current GTX770 does sweat.gif . I'm a graphics intensive gamer fyi wub.gif

I was watching the temperature all the time during gameplay (windowed mode), it can spike all of a sudden even when the scene in the game doesn't seem too graphically intensive. scary.
shikimori
post Aug 1 2015, 09:23 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(Ph 7.00 @ Aug 1 2015, 07:32 AM)
So far only Lord of the Fallen. I tested Alien Isolation (which I think on par graphic with the LOF hmm.gif ) at max setting 1080p no problem. Witcher 3 as well can run at max setting (minus hair animation) at less than 30fps tou but no sudden reboot occurred so far. So before I invest for the 980Ti, I want to make sure if it really worth and can run the LOF without sweat like my current GTX770 sweat.gif . I'm graphic intensive gamer fyi  wub.gif

I was watching the temperature all the time during gameplay (windowed mode), it can spike all at sudden even the event in the game seems not too graphic intensive. scary.
*
I think maybe it's about time to upgrade to maxwell, but at 1080p a 970 should be fine, no need to waste on a 980ti
unless you are on 1440p. I can only manage 72 fps T_T while 4k via dsr runs at 32 fps on ultra settings
Moogle Stiltzkin
post Aug 1 2015, 10:27 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
just wondering but what kind of performance upgrade can i expect going from a kepler 680gtx to a pascal ?

hmm.gif a rough guess.


By the way, an interesting comment by amd developer

QUOTE
"Nvidia Gameworks typically damages the performance on Nvidia hardware as well, which is a bit tragic really," he told PCR. "It certainly feels like it’s about reducing the performance, even on high-end graphics cards, so that people have to buy something new.


this i didn't really care because i usually get the high end cards anyway (minus titan x level rolleyes.gif )


QUOTE
“If you look at Crysis 2 in particular, you see that they’re tessellating water that’s not visible to millions of triangles every frame, and they’re tessellating blocks of concrete – essentially large rectangular objects – and generating millions of triangles per frame which are useless.


but this caught my eye though.


is there any basis on that claim i highlighted ? or is there a valid reason it's done that way. any IQ improvements or technical limitations for not doing this better ?

http://www.mcvuk.com/news/read/amd-accuses...amaging/0153557


it's a shame that these 2 camps don't have a cross-licensing agreement like intel did with amd for amd64 / the intel instruction sets (in the gpu case that would be nvidia's gameworks and gsync versus amd's source code for.... not sure what lel... freesync ? rolleyes.gif ), but sadly it's a reality we have to live with :{

but still i'd like to know regarding what i mentioned doh.gif are they gimping performance on purpose to get us to upgrade ?


This post has been edited by Moogle Stiltzkin: Aug 1 2015, 10:34 AM
TSskylinelover
post Aug 1 2015, 10:30 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(shikimori @ Aug 1 2015, 09:23 AM)
unless you are on 1440p I can only manage 72 fps T_T  while 4k via dsr at 32 fps ultra settings
*
Haha. This like back to crysis 1 days again. The 8800GTX/ULTRA only can run 30+fps at 1680*1050 full resolution, the benchmark display at that time. Cannot imagine how 5 inch extra today makes hell a lot of difference. laugh.gif rclxms.gif
marfccy
post Aug 1 2015, 02:21 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Moogle Stiltzkin @ Aug 1 2015, 10:27 AM)
just wondering but what kind of performance upgrade can i expect going from a kepler 680gtx to a pascal ?

hmm.gif  a rough guess.
By the way, an interesting comment by amd developer
this i didn't really care because i usually get the high end cards anyway (minus titan x level  rolleyes.gif  )
but this caught my eye though.
is there any basis on that claim i highlighted ? or is there a valid reason it's done that way. any IQ improvements or technical limitations for not doing this better ?

http://www.mcvuk.com/news/read/amd-accuses...amaging/0153557
it's a shame that these 2 camps don't have a similar agreement like intel did with amd for amd64 for intel instruction sets (in gpu case nvidia gameworks, gsync and source code for amds.... not sure what lel... freesync ?  rolleyes.gif ), but sadly it's a reality we have to live with :{

but still i'd like to know regarding what i mentioned doh.gif cause are they intentionally gimping performance on purpose to get us to upgrade ?
*
itll soon be forgotten, remember at start when TressFX was intensive as heck?

now its nothing much dee
Moogle Stiltzkin
post Aug 1 2015, 04:11 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
found my answer

user posted image

user posted image

user posted image


doh.gif so i definitely can ultra + msaa x2 or 4 + hairworks with a pascal :} on 1920x1200

http://www.anandtech.com/show/9306/the-nvi...980-ti-review/8

This post has been edited by Moogle Stiltzkin: Aug 1 2015, 04:12 PM
Killmeplsok
post Aug 1 2015, 05:21 PM

Getting Started
**
Junior Member
161 posts

Joined: Apr 2010
QUOTE(marfccy @ Aug 1 2015, 02:21 PM)
itll soon be forgotten, remember at start when TressFX was intensive as heck?

now its nothing much dee
*
Well, TressFX later became open sourced so Nvidia could optimize for it, same cannot be said for tessellation heavy frameworks such as hairworks though.
marfccy
post Aug 1 2015, 05:30 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Killmeplsok @ Aug 1 2015, 05:21 PM)
Well, TressFX later became open sourced so Nvidia could optimize for it, same cannot be said for tessellation heavy frameworks such as hairworks though.
*
the same can be said for AMD cards, their new archi could have way, way better tessellation than the previous one, making it less troublesome

it's just that for now AMD are still not tessellation optimised
TSskylinelover
post Aug 1 2015, 09:03 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Aug 1 2015, 04:11 PM)
found my answer

user posted image

user posted image

user posted image
doh.gif so i definitely can ultra + msaa x2 or 4 + hairworks with a pascal :} on 1920x1200

http://www.anandtech.com/show/9306/the-nvi...980-ti-review/8
*
Haha. Now that made me want 2 jump over 4k faster. I sien 1440p already after only 11 months using just games and movies. laugh.gif doh.gif

unequalteck
post Aug 3 2015, 12:17 AM

Custom member title
*******
Senior Member
2,690 posts

Joined: Dec 2008
From: Kota Kinabalu Current: Wangsa Maju


980ti reference under water.
3dmark max 46c <3 so much better


Attached thumbnail(s)
Attached Image
edmund_yung
post Aug 3 2015, 02:20 AM

Regular
******
Senior Member
1,198 posts

Joined: Aug 2009


I'm looking for a 980ti reference but many places don't stock them anymore, mostly non-ref out there. Anyone tried using a 980ti non-ref in an Ncase M1?

Edit: Nvm, got the ref already.

This post has been edited by edmund_yung: Aug 4 2015, 02:05 AM
SUScrash123
post Aug 4 2015, 04:35 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
At last..after hard decision..buy another card flex.gif

This post has been edited by crash123: Aug 4 2015, 04:35 PM


Attached thumbnail(s)
Attached Image Attached Image
Minecrafter
post Aug 4 2015, 04:40 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(crash123 @ Aug 4 2015, 04:35 PM)
At last..after hard decision..buy another card  flex.gif
*
Just wondering...why did you sell the GTX980Ti before? tongue.gif
SUScrash123
post Aug 4 2015, 04:42 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(Minecrafter @ Aug 4 2015, 04:40 PM)
Just wondering...why did you sold the GTX980Ti before? tongue.gif
*
fan too loud..wanna build a silent rig drool.gif
eatsleepnDIE
post Aug 4 2015, 04:49 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


QUOTE(unequalteck @ Aug 3 2015, 12:17 AM)
980ti reference under water.
3dmark max 46c <3 so much better
*
errr...you water cool it or put it "under" the water?

care to give pics or how to do?
Moogle Stiltzkin
post Aug 4 2015, 06:40 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(eatsleepnDIE @ Aug 4 2015, 04:49 PM)
errr...you water cool it or put it "under" the water?

care to give pics or how to do?
*
think he means water cooled doh.gif like this
user posted image


anything under 60c for gpu while on max performance is decent in my book :}


if pascal comes out with a small size due to hbm, maybe the gpu waterblock will also be equally smaller hence cheaper i hope hmm.gif


i respect amd for at least trying to ship water cooling by default on their fiji, unfortunately it had noise issues.... they trusted the wrong partner (cooler master from what i heard rolleyes.gif )


i personally use water cooling, and haven't had a single leak since my pc was built back in 2012 doh.gif but i use quality threads and water cooling parts. So the experience is probably different if you get bad parts smile.gif

posted my water cooling diary here sweat.gif
https://disqus.com/home/discussion/wccftech...ment-2160523623


PS: water cooled gpus tend to have prices marked up by 600 or more over a regular gpu due to the water block. So every time you upgrade your gpu you're paying substantially more just to get water cooling for that part .....

water cooling is expensive because some of the parts, particularly the water blocks, are not interchangeable with newer hardware doh.gif

This post has been edited by Moogle Stiltzkin: Aug 4 2015, 06:48 PM
unequalteck
post Aug 4 2015, 09:12 PM

Custom member title
*******
Senior Member
2,690 posts

Joined: Dec 2008
From: Kota Kinabalu Current: Wangsa Maju


QUOTE(eatsleepnDIE @ Aug 4 2015, 04:49 PM)
errr...you water cool it or put it "under" the water?

care to give pics or how to do?
*
i thought everyone saying like this also? under air/under water = air cooling/water cooling
eatsleepnDIE
post Aug 4 2015, 10:04 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


QUOTE(unequalteck @ Aug 4 2015, 09:12 PM)
i thought everyone saying like this also? under air/under water = air cooling/water cooling
*
sorry..noobs here lol tongue.gif
Moogle Stiltzkin
post Aug 5 2015, 09:07 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(llk @ Jul 8 2015, 08:55 PM)
Thinking of watercool my gtx980ti, our ringgit so low the block +backplate nearly cost RM700
*
gst and weak currency.... not really best time for us consumers >_<;

ironically when we become frugal with spending as a whole, that just digs us into further trouble rclxub.gif it's a cycle.

700 is alot :{




just see this


using the Amd R9 Fury x as an example because of the size of the card using hbm (hopefully pascal is this size)
https://shop.ekwb.com/ek-fc-r9-fury-x

113.99 USD = 441.829 MYR

1 USD = 3.87603 MYR



so the pascal would be somewhere around 2400 ish + 450 ish. Still better than 700 ish :/


If our currency becomes 1:4 thats gonna hurt cry.gif
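(Not from the original post: the figure above is just the USD price multiplied by the exchange rate. A minimal Python sketch of that arithmetic, using the example block price and 3.87603 rate quoted above; both are snapshot numbers from the post, not live data, so plug in current values yourself.)

CODE
# Minimal sketch: convert a waterblock's USD price to MYR at a given rate.
# The price and exchange rate below are the example figures from the post,
# not live data.
USD_TO_MYR = 3.87603

def to_myr(usd: float, rate: float = USD_TO_MYR) -> float:
    return usd * rate

print(f"EK-FC R9 Fury X: 113.99 USD = {to_myr(113.99):.2f} MYR")  # ~441.83 MYR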




the 980ti water block not much difference in price


EK-FC Titan X - Nickel
https://shop.ekwb.com/ek-fc-titan-x-nickel



125.49 USD = 486.243 MYR
user posted image



PS: never mix metals if you can avoid it. as the water circulates, over time it will corrode..... and thats whats gonna kill your radiator by making pinholes. So if you do have mixed metals, then you're gonna have to use anti-corrosion coolant as a safety precaution.


QUOTE
Short answer:
Nickel plated blocks are slightly more corrosion resistant.

Long answer:
As stated nickel is corrosion resistant. Also as stated copper is the best option for a heat sink. Nickel plated copper blocks do only have a thin layer of nickel.

The major benefit of a nickel plated copper water block is that if the surface to be mounted against the cpu is nickel plated it will most certainly be mirror smooth and will allow the thermal grease to flow without entrapping air bubbles. A non plated block could also be polished on the mating surface but will tarnish upon exposure to air so it would have to be done immediately before mounting and any buffing compound would have to be removed.

For corrosion resistance of an uncoated copper water block you could simply use some clear coat on the externally exposed copper except for the mating surface. Assuming you don't have large quantities of dissimilar metals in your system internal corrosion wouldn't be an issue. Keep in mind that any two dissimilar metals in contact or in the presence of the same electrolyte (a liquid which supports the flow of electrons) will incur a galvanic reaction. Most water based solutions are an electrolyte to some extent especially when you factor in contaminants.

If I were to design a small scale cooling system I would use all copper and solder which is meant for it to limit corrosion. Silver is a natural anti-microbial and would inhibit mould growth. Many potable copper pipe solders are silver bearing. The use of an automobile heater core which would be almost all copper would be perfect as a radiator. Standard car coolant of 50/50 mix would be fine as a coolant.
This post has been edited by Moogle Stiltzkin: Aug 5 2015, 09:21 AM
eatsleepnDIE
post Aug 5 2015, 10:28 AM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


» Click to show Spoiler - click again to hide... «


sifu wanna ask, what is the difference between the Acetal+Nickel waterblock and just the Acetal waterblock? i read that the performance is the same, is it true? do i need the anti corrosion coolant?

and on ek website, it says the block designed to accommodate a weak pump, so if i want to cool just my gc, suffice for me to get the cheapest and weakest pump right?

thanks in advance
Moogle Stiltzkin
post Aug 5 2015, 11:00 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(eatsleepnDIE @ Aug 5 2015, 10:28 AM)
» Click to show Spoiler - click again to hide... «


sifu wanna ask, what is the different between Acetal+Nickel waterblock and just the Acetal waterblock? i read it is the same performance is it true? do i need the anti corrosion coolant?

and on ek website, it says the block designed to accommodate a weak pump, so if i want to cool just my gc, suffice for me to get the cheapest and weakest pump right?

thanks in advance
*
not really an expert but i am vaguely aware of the galvanic issue doh.gif

QUOTE
Even liquid with a corrosion inhibitor won't last long. You'll likely be changing coolant very often. It's best not to use them together at all.

The most damage would be to the aluminum radiator, not so much the block (which is unfortunate, because you can't see the damage inside the rad). Eventually, you'll probably end up with pinhole leaks in your rad.

I'd recommend getting a copper radiator or a nickel plated block..... or a lot of anti-corrosive liquid.

Liquid usually turns into a milky color when corrosion is occurring in your loop.

http://www.overclock.net/t/1222245/how-to-...orrosion-alu-cu



i'm using a copper radiator and cpu waterblock


1 x Feser X-Changer Triple 360 mm Extreme Performance Radiator - 15mm Spacing
user posted image

1 x Swiftech Apogee XT CPU Waterblock
user posted image


unfortunately when i bought my 680gtx waterblock i forgot to check the metal type.... i ended up getting/using a nickel/acetal one. however thankfully...

QUOTE
There is no difference really.

Nickel plated and full nickel are the same thing just worded differently.
Acetal is a type of plastic that the top is made of, it will either be acrylic or acetal.

EN nickel is a newer type of nickel plating that EK uses and is supposed to reduce the likeliness of the nickel flaking.

Copper and nickel are almost the same chemically so there isn't any problem mixing them.

EK only sells EN nickel now as they had some major problems with the nickel plating flaking off.

Some people will say not to use EK due to the nickel flaking problems which still seems to happen sometimes with the EN but I have had no problems with mine (running since May)


http://www.overclock.net/t/1342447/differe...and-full-nickel

smile.gif phew dodged a bullet



anyway i use plain distilled water.
QUOTE
Distilled water is water that has many of its impurities removed through distillation. Distillation involves boiling the water and then condensing the steam into a clean container.
i prefer a simple setup. Sure i could have put in an anti-rust additive, but that takes maintenance to change out..... and even worse is colored coolant, because over time it turns into slimy gunk. You need to take out the whole radiator to shake out the slime, and replace it......

thats why simple effective setup is best.


I use 2 of this 99.9% silver kill coil inside my water reservoir to prevent algae from growing
user posted image


and this tygon silver lined tubing
user posted image




water cooling the cpu and gpu makes the most sense. you can reduce the rev-up noise from the fans when the pc is under load.... also good for pcs like mine which is left on 24/7

and even though i use 6 fans on my radiator, i set them to low-medium and can hardly hear the pc make any noise at all doh.gif


eatsleepnDIE
post Aug 5 2015, 12:22 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


» Click to show Spoiler - click again to hide... «


so much info! thanks a lot! now i think i have more confidence in starting water cooled setup biggrin.gif
TSskylinelover
post Aug 5 2015, 12:25 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Haha interesting
Moogle Stiltzkin
post Aug 5 2015, 12:35 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(eatsleepnDIE @ Aug 5 2015, 12:22 PM)
» Click to show Spoiler - click again to hide... «


so much info! thanks a lot! now i think i have more confidence in starting water cooled setup biggrin.gif
*
if you want to save money, maybe you can skip the gpu waterblock (which is much harder to carry over to newer gpus in future).

and just get a cpu waterblock only. depends how much you're willing to spend every 4-6 years when you're replacing your gear (for which you may need to buy a compatible waterblock for the new part as well ohmy.gif )


yeah moogle... spreading the poison whether it be mechanical keyboards or water cooling laugh.gif

user posted image



by the way the latest game i'm playing is magic the gathering duel origins. even at ultra settings the aliasing in this game is so bad.... despite having some good looking stills ..... it's game developers doing crap like this that makes me wonder why i even bother getting a top of the line graphics card doh.gif

This post has been edited by Moogle Stiltzkin: Aug 5 2015, 12:37 PM
marfccy
post Aug 5 2015, 03:22 PM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(Moogle Stiltzkin @ Aug 5 2015, 11:00 AM)
» Click to show Spoiler - click again to hide... «

*
this pretty much explains the dark side of watercooling, sure its cool as heck, and your PC runs better, but you have to deal with more maintenance if you pick the wrong parts
eatsleepnDIE
post Aug 5 2015, 03:23 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


» Click to show Spoiler - click again to hide... «


lol i dont care about my cpu (well actually i do but less than my gpu) coz i could buy an aio for it and forget about it for at least a year...

but for my gpu...it is hard to make it cool (it is cool and hot at the same time if you catch my drift tongue.gif ) coz a complete water cooled setup will require maintenance which im too lazy to do...so i guess the easiest path is an aio with a gpu bracket like the kraken g10 or the evga hybrid solution.

all in all, i hope malaysia will get snow soon so i could oc my gpu to crazy level with stock cooler lol.
SSJBen
post Aug 5 2015, 03:55 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(eatsleepnDIE @ Aug 5 2015, 03:23 PM)
» Click to show Spoiler - click again to hide... «


lol i dont care about my cpu (well actually i do but less then my gpu) coz i could buy an aio for it and forget about it for at least a year...

but for my gpu...it is hard to make it cool (it is cool and hot at the same time if you catch my drift  tongue.gif  ) coz a complete water cooled setup will require a maintenance which im too lazy to do...so i guess the easiest path is aio with the gpu bracket like kraken g10 or evga hybrid solution.

all in all, i hope malaysia will get snow soon so i could oc my gpu to crazy level with stock cooler lol.
*
The day Malaysia gets snow is when the world will perish... logically and scientifically speaking.

But on a more on-topic note, I don't know where the assumption comes from if snow = super low temps for PC? Okay sure, definitely lower than in hot countries but the perception of living in a cold country and having low temps on PC is wrong. I stayed in the US during winter, several times. Guess what? We have to turn on the heaters.

The house is actually dryer and hotter with the heaters than in Malaysia where you'd just have the fan on instead. Ironic? Yeah.

Don't turn on the heaters? Well, if you enjoy wearing 3 layers of clothes, then be my guest. biggrin.gif


Also, one of the main reasons for going water is looks. I don't know why people find it pleasing to have 2 AIOs in their system, it's so tacky and n00bish looking. I mean, I'm not trying to even sound like a snob here but it is what it is. Might as well just optimize for air cooling instead, because good air cooling is nearly as effective as AIOs (in some cases, even better), costs less and guess what? It won't break (whereas AIOs will, just read up on the numerous stories about AIOs dying prematurely).
Moogle Stiltzkin
post Aug 5 2015, 04:09 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
weird how nvidia gpu is performing different on linux and windows
http://www.phoronix.com/scan.php?page=arti...n10-linux&num=2


*short story

play game only on windows tongue.gif lel



QUOTE(SSJBen @ Aug 5 2015, 03:55 PM)
The day Malaysia gets snow is when the world will perish... logically and scientifically speaking.

But on a more on-topic note, I don't know where the assumption comes from if snow = super low temps for PC? Okay sure, definitely lower than in hot countries but the perception of living in a cold country and having low temps on PC is wrong. I stayed in the US during winter, several times. Guess what? We have to turn on the heaters.

The house is actually dryer and hotter with the heaters than in Malaysia where you'd just have the fan on instead. Ironic? Yeah.

Don't turn on the heaters? Well, if you enjoy wearing 3 layers of clothes, then be my guest. biggrin.gif
Also, one of the main reason for going water is for looks. I don't know why people found it pleasing to have 2 AIOs in their system, it's so tacky and n00bish looking. I mean, I'm not trying to even sound like a snob here but it is what it is. Might as well just optimize for air cooling instead, because good air cooling is nearly as effective as AIOs (in some cases, even better), cost less and guess what? It won't break (where as AIOs will, just read up the numerous amount of stories about AIOs dying prematurely).
*
i'd spend my buck on pc, gpu, hdds, etc etc.... water cooling really is the last resort when you run out of things to upgrade. that and mkb laugh.gif

i understand the temptation of the bling, but in all honesty the primary concern should be the temps. just run a few temp monitoring programs to track your cpu and gpu temps. if it goes 70-80+ you should really go water cooling for sure :]

or even the noise issue. for me this was important :}
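(Not from the original post: if you'd rather log those temps from a script than keep a monitoring app open, here's a minimal Python 3 sketch. It assumes an NVIDIA card with nvidia-smi on the PATH; the 80 degree alert threshold is only an illustrative number picked to match the 70-80+ range mentioned above, not a recommendation.)

CODE
# Minimal sketch: poll the GPU core temperature once a second via nvidia-smi.
# Assumes an NVIDIA card with nvidia-smi on the PATH and Python 3.
import subprocess
import time

ALERT_C = 80  # illustrative threshold only

def gpu_temp_c() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip())

while True:
    t = gpu_temp_c()
    print(f"GPU core: {t} C")
    if t >= ALERT_C:
        print("running hot, maybe time to look at better cooling")
    time.sleep(1)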


This post has been edited by Moogle Stiltzkin: Aug 5 2015, 04:16 PM
eatsleepnDIE
post Aug 5 2015, 04:09 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


» Click to show Spoiler - click again to hide... «


sir, i am a NOOB and my chassis is mitx type so i dont care about the internal looks tongue.gif

errr...i thought it is normal to turn on the heater during winter? because outside is cold and you want the warmth inside the house. Well yeah, wishing for snow to cool the rig was plain stupid so I apologise for that lol.

I dont know about AIOs dying prematurely, it has never happened to me thank god, but I still favor the AIO coz of the simplicity and trouble-free setup (again this may vary).

Again, this is my current opinion, maybe it will change later tongue.gif
SSJBen
post Aug 5 2015, 06:03 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Aug 5 2015, 04:09 PM)
weird how nvidia gpu is performing different on linux and windows
http://www.phoronix.com/scan.php?page=arti...n10-linux&num=2
*short story

play game only on windows tongue.gif lel
i'd spend my buck on pc, gpu, hdds, etc etc.... water cooling really is the last resort when you run out of things to upgrade. that and mkb  laugh.gif

i understand the temptation of the bling, but in all honestly the primary concern should be the temp. just run a few temp program to track your cpu and gpu temps. if it goes 70-80+ should really go water cooling for sure :]

or even the sound issue. for me this was important :}
*
There are 3 main reasons for water, temps obviously, looks and noise.

I personally don't like AIOs at all. They're inconsistent, could either be noisy, look terrible, or just like I said... prematurely fail without you knowing.
That's why I said if AIO is the last resort for lower temps, that I do not agree at all. A very well optimized case for airflow and airblow is just as good as AIOs, if not better (the latter, if you bring cost into the discussion).


QUOTE(eatsleepnDIE @ Aug 5 2015, 04:09 PM)
» Click to show Spoiler - click again to hide... «


sir, i am a NOOB and my chassis is mitx type so i dont care about the internal looks  tongue.gif

errr...i thought it is normal to turn on the heater during winter? because outside is cold and you want the warmth inside the house. Well yeah the wishing for the snow to cool the rig are plain stupid so I apologise for that lol.

I dont know about the AIO dying prematurely, never occurred to me thank god for that but I still favored the AIO coz the simpleness and troubled free (again this may varied).

Again, this is my current opinion, maybe will change later  tongue.gif
*
Keep those fingers crossed. I'm not trying to hate on AIOs, but when I open them up and see how low quality the stuff in there is, it makes me cringe when people go around and say AIOs are awesome. They don't even know what they paid for. AIOs aren't even leak-proof to begin with, yet they are being marketed as being so...?

Good that your AIO has not died. But just keep an eye on it when you start hearing the pump squeal. I have seen several systems die because a failed AIO left the system running at very high temps for a long period of time.
Loki[D.d.G]
post Aug 5 2015, 06:18 PM

Quis custodiet ipsos custodes
*******
Senior Member
3,648 posts

Joined: Sep 2009
From: Twixt nether and ether
QUOTE(SSJBen @ Aug 5 2015, 03:55 PM)
The house is actually dryer and hotter with the heaters than in Malaysia where you'd just have the fan on instead. Ironic? Yeah.
*
It's almost always drier in temperate climates than it is in Malaysia, with or without the AC/heater on. The humidity here is all but unbearable at times...

QUOTE(SSJBen @ Aug 5 2015, 06:03 PM)
There are 3 main reasons for water, temps obviously, looks and noise.
*
The noise part is situational. It'll certainly generate less noise at full or heavy loads for high TDP cards. But take for instance my Gigabyte GTX 970 G1 Gaming with its ridiculously powerful 600W air cooler which is virtually silent over the sound of my already quiet Noctua case fans even when loaded.
Moogle Stiltzkin
post Aug 5 2015, 07:07 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Aug 5 2015, 06:03 PM)
There are 3 main reasons for water, temps obviously, looks and noise.

I personally don't like AIOs at all. They're inconsistent, could either be noisy, look terrible, or just like I said... prematurely fail without you knowing.
That's why I said if AIO is the last resort for lower temps, that I do not agree at all. A very well optimized case for airflow and airblow is just as good as AIOs, if not better (the latter, if you bring cost into the discussion).
Keep those fingers crossed. I'm not trying to hate on AIOs, but when I open up and see how low quality the stuff are in there, it cringes me when people go around and say AIOs are awesome. They didn't even know what they paid for. AIO isn't even leak-proof to begin with, yet it is being marketed as being so...?

Good that your AIO has not died. But just keep an eye on it when you start hearing the pump squeel. Have seen several systems died because of a failed AIO and the system was left running at very high temps for a long period of time.
*
i stay away from AIO's

tbh i never tried them before, but i never heard good things about them sad.gif

besides my Swiftech MCP 655 pump is very reliable.... thats why
user posted image
https://martinsliquidlab.wordpress.com/2012...ve-pump-review/



Anyway regarding waterblocks, apparently the EKWB one meant for the titan x also works on the 980ti. So some waterblocks will work with other gpus.... but i wouldn't hold my breath that the same will happen with pascal. The moment pascal comes out, my 680 water block is as good as an expensive paper weight doh.gif




For newbies by the way, an AIO water cooler is kinda like this. but in this instance the loop is strictly only for the gpu
user posted image

QUOTE
At the top of the pack are a pair of water-cooled cards. The GTX 980 Ti HYBRID uses a closed loop water cooler with a 120 mm radiator and retains the cooling fan to keep the rest of the card cool, similar to the solution used for AMD's R9 295x2. EVGA claimed the hybrid cooler significantly lowers GPU temperatures.

Also announced is a Hydro Copper edition with dedicated water block that includes G1/4 threads for use with standard water cooling fittings.





eatsleepnDIE
post Aug 5 2015, 07:11 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


QUOTE(SSJBen @ Aug 5 2015, 06:03 PM)
There are 3 main reasons for water, temps obviously, looks and noise.

I personally don't like AIOs at all. They're inconsistent, could either be noisy, look terrible, or just like I said... prematurely fail without you knowing.
That's why I said if AIO is the last resort for lower temps, that I do not agree at all. A very well optimized case for airflow and airblow is just as good as AIOs, if not better (the latter, if you bring cost into the discussion).
Keep those fingers crossed. I'm not trying to hate on AIOs, but when I open up and see how low quality the stuff are in there, it cringes me when people go around and say AIOs are awesome. They didn't even know what they paid for. AIO isn't even leak-proof to begin with, yet it is being marketed as being so...?

Good that your AIO has not died. But just keep an eye on it when you start hearing the pump squeel. Have seen several systems died because of a failed AIO and the system was left running at very high temps for a long period of time.
*
yeah, thanks for the advise. i do hope it will last longer. will keep my ears wide and open to hear any unusual noise nod.gif
TSskylinelover
post Aug 5 2015, 09:06 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Aug 5 2015, 04:09 PM)
weird how nvidia gpu is performing different on linux and windows
http://www.phoronix.com/scan.php?page=arti...n10-linux&num=2
*short story

play game only on windows tongue.gif lel
i'd spend my buck on pc, gpu, hdds, etc etc.... water cooling really is the last resort when you run out of things to upgrade. that and mkb  laugh.gif
*
Haha 4 me i will spend my entire money 2 the gf when upgrading cycle finish laugh.gif unfortunately no gf want 2 take my money doh.gif even though i truly believe in "ada wang ada amoi" which doesnt work at all 2 my situation dang

This post has been edited by skylinelover: Aug 5 2015, 09:08 PM
SSJBen
post Aug 6 2015, 02:10 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Aug 5 2015, 07:07 PM)
Anyway regarding waterblocks apparently the one meant for titan x also works for the 980ti for the EKWB branded one. So some waterblocks will work with other gpu.... but i wouldn't hold my breath if it would on pascal. Moment pascal comes out, my 680 water block is as good as an expensive paper weight  doh.gif
*
Zero chances of any existing blocks working for Pascal. The fact that the PCBs will definitely be smaller and also the GPU + HBM dies being overall bigger, there's just no way for any existing block to be compatible for Pascal.

Yeah, one problem with GPU waterblocks is that it isn't compatible forever. More so in Malaysia where the WC community is quite puny, selling it off is also quite difficult. US and EU has a way larger market where even blocks back from the Fermi days can still be sold relatively easily.
Moogle Stiltzkin
post Aug 6 2015, 02:22 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Aug 6 2015, 02:10 PM)
Zero chances of any existing blocks working for Pascal. The fact that the PCBs will definitely be smaller and also the GPU + HBM dies being overall bigger, there's just no way for any existing block to be compatible for Pascal.

Yeah, one problem with GPU waterblocks is that it isn't compatible forever. More so in Malaysia where the WC community is quite puny, selling it off is also quite difficult. US and EU has a way larger market where even blocks back from the Fermi days can still be sold relatively easily.
*
guess you're right lookin at this picture hmm.gif
user posted image


to digress, apparently skylake cpu is
QUOTE
This new LGA 1151 socket is backwards compatible with most air and water CPU coolers which is certainly welcomed.
http://www.hardocp.com/article/2015/08/05/...ocking_review/2


by the way i tried to get my pc serviced at apes for water cooling. they helped me set it up the first time.

but on my second trip they held my pc for 2 weeks and did nothing with it. so i collected it back and nothing was done, was disappointing. you know any alternative places i can seek help sad.gif

is a bit related because when pascal comes out, i'm gonna need help to fit in a new water block for gpu rclxub.gif



random pic 980ti asus poseidon water cooled
user posted image


water cooling options for pascal... either i get like a poseidon asus type, or a specialist block sold separately like from ekwb doh.gif whichever is cheapest. preferably copper this time around (instead of the nickel/acetal i'm using atm sweat.gif )

This post has been edited by Moogle Stiltzkin: Aug 6 2015, 02:29 PM
terradrive
post Aug 6 2015, 03:14 PM

RRAAAWWRRRRR
******
Senior Member
1,943 posts

Joined: Apr 2005


Or just stay with air cooling and turn the aircon to the max lol
SUScrash123
post Aug 6 2015, 04:19 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011

agree with u..but some people like me wanna have a tidy/pretty rig..and a high performance cooler like the nh-d15 will block the ram..even though air cooler performance per noise is great compared to a CLC..
so the last resort water cooling solution sweat.gif
TSskylinelover
post Aug 6 2015, 08:15 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Haha. Devils canyon FTW. Skylake did not impress me. laugh.gif doh.gif
SSJBen
post Aug 6 2015, 09:37 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(skylinelover @ Aug 6 2015, 08:15 PM)
Haha. Devils canyon FTW. Skylake did not impress me. laugh.gif doh.gif
*
Actually... there's more to the story than just avg fps and 10% higher numbers.
In a nut shell (since this isn't an Intel thread), Skylake > Devils Canyon/Haswell by a noticeable margin when it comes to min.fps and most importantly, frame latency.

It's even more significant if comparing Skylake to SB/IVB though. So... yeah, don't just look at the avg.fps and draw the conclusion.


*PS
Where's the Intel thread anyway?


*EDIT
For reference only:




See how frame hitching on i5 6600k is as good as an i7 4790k at lower clockspeeds? OC that thing mildly and wham! It's a clear noticeable difference!

This post has been edited by SSJBen: Aug 6 2015, 09:46 PM
Moogle Stiltzkin
post Aug 6 2015, 09:53 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Aug 6 2015, 09:37 PM)
Actually... there's more to the story than just avg fps and 10% higher numbers.
In a nut shell (since this isn't an Intel thread), Skylake > Devils Canyon/Haswell by a noticeable margin when it comes to min.fps and most importantly, frame latency.

It's even more significant if comparing Skylake to SB/IVB though. So... yeah, don't just look at the avg.fps and draw the conclusion.
*PS
Where's the Intel thread anyway?
*EDIT
For reference only:


See how frame hitching on i5 6600k is as good as an i7 4790k at lower clockspeeds? OC that thing mildly and wham! It's a clear noticeable difference!
*
ooo so the 20fps increase isn't really the full story ? ok i'll check that out ty.


*update

found a newer video



This post has been edited by Moogle Stiltzkin: Aug 6 2015, 10:00 PM
SUScrash123
post Aug 7 2015, 04:57 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011

Hurm, not worth upgrading, need to buy new mobo, cpu and ddr4. I think I will skip Pascal as well. Wait until HBM matures 1st. Maybe build new rig when pcie 4.0 is released
Moogle Stiltzkin
post Aug 7 2015, 05:42 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(crash123 @ Aug 7 2015, 04:57 PM)

Hurm, not worth uprgrade, need to buy new mobo,cpu and ddr4. I think I will skip Pascal as well. Wait until HBM matured 1st. Maybe build new rig when pcie 4.0 release
*
well amd's fiji was using hbm1 which had a critical flaw. it was maxed at 4gb vram compared to current offerings like 980ti that had 6gb of gddr5 vram.

that said it was few titles that pushed vram usage, and even then most of the time it was for 1440p or higher resolutions.

but still it was a limitation.

hbm2 which is what pascal will be using will be either 8,16 or even 32gb. Either one will still be amazingly ample enough for just about anyone smile.gif

also bandwidth will be 1tb/s vs the current 500gb/s ish or so for hbm1.

So in my book i think that is more than mature enough to be worth buying :} so i wonder, what improvement do you think is worth waiting for, when on paper it already seems more than sufficient to last for a long while hmm.gif more vram ? seems like even 8gb is good enuff. faster bandwidth ? sure, but by how much and is it really worth waiting for ?


people also mentioned how 980 quickly is going to become obsolete by a large margin due to many factors pointed out about pascal's tech e.g.
1. hbm2 (smaller gpu cards, more memory bandwidth and bigger vram capacities)
2. nvlink (probably going to benefit multi gpu setups)
3. double the transistor count of a titan x and fiji amd (better gpu performance ?)
4. mixed precision (probably less of an effect for gamers)

basically i think this is going to be one of those times there is a giant leap in performance. volta was delayed for quite a long time and is scheduled for after pascal maybe 1 or 2 years after. i'm placing my bet the leap from pascal to volta won't be too huge.

This post has been edited by Moogle Stiltzkin: Aug 7 2015, 06:05 PM
SSJBen
post Aug 7 2015, 06:05 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


I believe Pascal to Volta will be like Kepler to Maxwell. More improvements in performance-per-watt instead of just pure increase in rendering power.

But understand that in 3 years time, the landscape of developing games will be different than how it is now. Just like 3 years ago (and also the 3 years prior), developing games was different than it is today.

On the side note, Skylake is a sensible upgrade for SB users, not saying that SB isn't fast enough anymore because it still is. But I do believe that games in the next year+ will indeed start to be CPU-bound for SB users, it's already starting to actually. It's not going to be a large number of games that does that, but don't be surprised is all I'm saying.

Once again I'd like to remind everyone that you HAVE TO look at frame latency when factoring the performance difference between GPUs and CPUs. Avg. FPS simply DOES NOT tell the whole story. Avg.60fps is not really 60fps if you get erratic drops to low 40s for like half a second every 3 or 4 minutes of game time. People seem to forget this point.
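
To make that last point concrete, here is a rough, made-up illustration (a small python sketch, not output from any real benchmark tool; the frame-time numbers are invented purely for the example) of why average FPS alone hides hitching:
CODE
# made-up frame times in milliseconds: two runs with near-identical average FPS
smooth = [16.7] * 300                    # steady ~60 fps
hitchy = [15.0] * 297 + [180.0] * 3      # mostly fast, but a few long stalls

def avg_fps(frame_times_ms):
    # average FPS = frames rendered / seconds elapsed
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def one_percent_low_fps(frame_times_ms):
    # "1% low": FPS computed over only the slowest 1% of frames
    worst = sorted(frame_times_ms, reverse=True)
    worst = worst[:max(1, len(worst) // 100)]
    return len(worst) / (sum(worst) / 1000.0)

for name, run in (("smooth", smooth), ("hitchy", hitchy)):
    print(f"{name}: {avg_fps(run):.1f} avg fps, {one_percent_low_fps(run):.1f} fps 1% low")

# prints roughly 59.9 avg / 59.9 "1% low" for the smooth run, but 60.1 avg / 5.6 "1% low"
# for the hitchy one: same average fps, very different worst case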
SUScrash123
post Aug 7 2015, 07:43 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011

Ya i kinda agree this hbm thing is really future tech. If u see benchmark between gtx 980ti/titan sli vs fury x cf, u will see fury x is more efficient in cf
I will wait until they refresh the skylake and pascal then i will consider to totally upgrade my rig

Fury x 4GB almost beat 980ti 6GB. I wonder what a Fury x 6GB would do  sweat.gif Gonna save my money for future rig

Pascal refresh, PCIEx16 Gen 4, Skylake refresh, cheap DDR 4 and NVMe. That's the year I will upgrade the rig  rclxm9.gif

This post has been edited by crash123: Aug 7 2015, 07:43 PM
Moogle Stiltzkin
post Aug 7 2015, 08:09 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(crash123 @ Aug 7 2015, 07:43 PM)

Ya i kinda agree this hbm thing is really future tech. If u see benchmark between gtx 980ti/titan sli vs fury x cf, u will see fury x is more efficient in cf
I will wait until they refresh the skylake and pascal then i will consider to totally upgrade my rig

Fury x 4GB almost beat 980ti 6GB. I wonder what if Fury x 6GB will do  sweat.gif Gonna save my money for future rig

Pascal refresh, PCIEx16 Gen 4, Skylake refresh, cheap DDR 4 and NVMe. Thats is the year I will upgrade the rig  rclxm9.gif
*
ddr4 cheap ? hm... last i checked it wasn't. maybe it's changed ? but that said there are motherboards that will support ddr3 so can keep using old ram which is sufficiently good even for gaming.

i'm not really excited about pc ram unless it's hmc or xpoint hmm.gif

as for pciex16 even my older ivy bridge mobo had that
QUOTE
2 x PCIe 3.0 x16
1 x PCIe 2.0 x16
2 x PCIe 2.0 x1
2 x PCI


user posted image

for me i use 1 pcie 3.0 x16 for my single gpu. and i guess the other can be used a pcie ssd (then again isn't m.sata ssd better ?)

anyway guess those multi gpu setups would need those extra pcie 3.0 slots doh.gif for me think is sufficient with what i got atm.

SUScrash123
post Aug 7 2015, 08:50 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(Moogle Stiltzkin @ Aug 7 2015, 08:09 PM)
ddr4 cheap ? hm... last i checked it wasn't. maybe it's changed ? but that said there are motherboards that will support ddr3 so can keep using old ram which is sufficiently good even for gaming.

i'm not really excited about pc ram unless it's hmc or xpoint  hmm.gif

as for pciex16 even my older ivy bridge mobo had that
user posted image

for me i use 1 pcie 3.0 x16 for my single gpu. and i guess the other can be used a pcie ssd (then again isn't m.sata ssd better ?)

anyway guess those multi gpu setups would need those extra pcie 3.0 slots doh.gif for me think is sufficient with what i got atm.
*
lol u dont get the point that i want to make. I say, when Pascal refresh, PCIEx16 Gen 4, Skylake refresh, cheap DDR 4 and NVMe are released, that is the year i will upgrade my rig. Btw PCIEx16 Gen 4 is released next year or 2017

PCIE x16 Gen 3.0 = 8 GT/s bit rate, doubling the lane bandwidth compared to PCI Express 2.0
PCIE x16 Gen 4.0 = 16 GT/s bit rate, doubling the lane bandwidth compared to PCI Express 3.0

so in Gen4 u can tri sli and still get the same bandwidth as PCIE x16 3.0  drool.gif
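
For anyone wanting the back-of-envelope maths on those figures, here is a small python sketch (it assumes the standard 128b/130b encoding that PCIe 3.0 and 4.0 both use; the helper function name is just for illustration):
CODE
# rough per-direction bandwidth for an x16 slot from the raw line rate
def x16_bandwidth_gb_s(gt_per_s, encoding=128 / 130):
    # 1 GT/s is roughly 1 Gbit/s per lane; apply encoding overhead,
    # divide by 8 for GB/s, then multiply by 16 lanes
    per_lane_gb_s = gt_per_s * encoding / 8
    return per_lane_gb_s * 16

print(f"PCIe 3.0 x16: ~{x16_bandwidth_gb_s(8):.1f} GB/s each way")
print(f"PCIe 4.0 x16: ~{x16_bandwidth_gb_s(16):.1f} GB/s each way")
print(f"PCIe 4.0 x8 : ~{x16_bandwidth_gb_s(16) / 2:.1f} GB/s (about the same as 3.0 x16)")

# so a card dropped to x8 lanes on gen 4 still sees roughly gen 3 x16 bandwidth,
# which is the point being made about multi-card setups above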
Moogle Stiltzkin
post Aug 7 2015, 09:27 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(crash123 @ Aug 7 2015, 08:50 PM)
lol u dont get the point that i want to make. I say, when Pascal refresh, PCIEx16 Gen 4, Skylake refresh, cheap DDR 4 and NVMe is release, that is the year i will upgrade my rig. Btw PCIEx16 Gen 4 is release next year or 2017

PCIE x16 Gen 3.0 = 8 GT/s bit rate, doubling the lane bandwidth compare to PCI Express 2.0
PCIE x16 Gen 4.0 = 16 GT/s bit rate , doubling the lane bandwidth compare to PCI Express 3.0

so in Gen4 u can tri sli and got same bandwith as PCIE x16 3.0  drool.gif
*
gen4 ? oo didn't notice that doh.gif

so how much gt/s does the current 980ti pump out atm doh.gif ?

oh nm found it for titanx
http://www.tomshardware.co.uk/nvidia-gefor...view-33214.html

Texture Fillrate 192 GT/s

memory transfer rate 7 GT/s


so even gen 4 isn't going to be enough ?

QUOTE
Basically, NVLink provides a bigger pipe between the GPU and the CPU, and therefore, a much bigger data path, at least by today’s (or the immediate future’s) standards. For example, PCIe 3.0 transfers data at an impressive 8 gigatransfers per second (GTs), while Nvidia’s NVLink is expected to move data at about 20GTs, which is over twice as fast. 


so that nvlink 20 gt/s is needed even for single gpus ? hmm.gif

This post has been edited by Moogle Stiltzkin: Aug 7 2015, 09:29 PM
SUScrash123
post Aug 7 2015, 09:44 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(Moogle Stiltzkin @ Aug 7 2015, 09:27 PM)
gen4 ? oo didn't notice that doh.gif

so how much gt/s does the current 980ti pump out atm doh.gif ?

oh nm found it for titanx
http://www.tomshardware.co.uk/nvidia-gefor...view-33214.html

Texture Fillrate 192 GT/s

memory transfer rate 7 GT/s
so even gen 4 isn't going to be enough ?
so that nvlink 20 gt/s is needed even for single gpus ? hmm.gif
*
Faster transfer rates matter more to multi gpu users. For a single gpu user it will not give too much benefit in performance

read this article to learn more thumbup.gif

This post has been edited by crash123: Aug 7 2015, 09:44 PM
Moogle Stiltzkin
post Aug 7 2015, 10:19 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(crash123 @ Aug 7 2015, 09:44 PM)
Faster transfer rate is more to multi gpu user. For single gpu user its will not give to much benefit in performance

read this article to learn more  thumbup.gif
*
will do tx notworthy.gif
TSskylinelover
post Aug 8 2015, 05:40 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Lolz. Here since when bcum skylake toks. laugh.gif doh.gif Anyway my next cycle of upgrading will be 4k gaming. That means 2017 and beyond. laugh.gif rclxms.gif Monitor gpu and cpu all in 1 loot. Hahahaha.

This post has been edited by skylinelover: Aug 8 2015, 05:41 PM
Moogle Stiltzkin
post Aug 8 2015, 06:55 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
some blockbuster titles based on nvidia gameworks



Minecrafter
post Aug 8 2015, 10:17 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(Moogle Stiltzkin @ Aug 8 2015, 06:55 PM)
some blockbuster titles based on nvidia gameworks

*
I'll rage quit if EA use Gameworks for FIFA...
SSJBen
post Aug 9 2015, 02:03 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(skylinelover @ Aug 8 2015, 05:40 PM)
Lolz. Here since when bcum skylake toks. laugh.gif doh.gif Anyway my next cycle of upgrading will be 4k gaming. That means 2017 and beyond. laugh.gif rclxms.gif Monitor gpu and cpu all in 1 loot. Hahahaha.
*
MYR 5.0 to USD1 that time... whistling.gif
Moogle Stiltzkin
post Aug 9 2015, 02:14 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Minecrafter @ Aug 8 2015, 10:17 PM)
I'll rage quit if EA use Gameworks for FIFA...
*
well i just recently got...... *cough lords of the fallen which is a heavily gameworks developed game.

Seemed pretty decent graphics wise hmm.gif

Though oddly i had an issue on windows 10 x64 that i couldn't play it unless i set physx to cpu only in nvidia panel settings. otherwise it would crash on game save load.

anyways the game will randomly crash. seems not to be too stable on windows 10 hmm.gif



QUOTE(SSJBen @ Aug 9 2015, 02:03 PM)
MYR 5.0 to USD1 that time... whistling.gif
*


if someone could edit that song to say have to buy amd card instead doh.gif lelz...

This post has been edited by Moogle Stiltzkin: Aug 9 2015, 02:17 PM
Moogle Stiltzkin
post Aug 9 2015, 02:56 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
tweakguide updated their nvidia graphics settings guide here
http://www.tweakguides.com/NVFORCE_1.html


very useful tips like
QUOTE
For example, if you enable MFAA in the NVCP, then select 2x MSAA in a game, you will get the equivalent of 4x MSAA quality without any extra drop in performance; set 4x MSAA in the game and MFAA will convert it to 8x MSAA quality for free, and so on.


QUOTE
I recommend that Power Management Mode be set to Adaptive under Global Settings. For any games for which you believe your GPU is constantly downclocking, you can change this setting to Prefer Maximum Performance under the Program Settings tab to ensure the highest possible clock rates at all times. Remember that this setting only relates to games and other 3D applications, not to 2D applications or the Windows Desktop. Note also that if you run a multi-monitor and/or high refresh rate display your idle clocks may be slightly higher regardless of this setting, which is normal.


QUOTE
It is recommended that Texture Filtering - Quality be set to High Quality on medium and high-end systems, and High Performance on low-end systems under Global Settings. For particular games where you have performance to spare, you can select High Quality, and for those which are more strenuous, you can select High Performance under Program Settings as required. I can see no real reason to bother with using the Performance or Quality options for this setting, given the performance and image quality difference is extremely small even at the extremes of High Quality and High Performance. It's best just to use High Quality if you prefer the highest image quality, or High Performance if you prefer a potential performance boost. Additionally, there's no need to adjust the Texture Filtering - Anisotropic Sample Optimization and Texture Filtering - Trilinear Optimization settings separately; use this setting as your primary control over texture filtering and allow those to be adjusted automatically by this setting.
QUOTE
There is no simple solution when it comes to VSync. Whether you enable it, disable it, use Adaptive VSync, or set an FPS limit, there are always some compromises involved. The only no-compromises solution is to purchase a G-Sync capable monitor, which is worth considering the next time you want to replace your display.


smile.gif


by the way just wondering, for AF do you all set it globally ? cause it seems like then i'd have to manually disable af for any game that already has it in its options hmm.gif so isn't it simpler to leave it application-controlled, and only set it manually for a game that doesn't have the option ?

This post has been edited by Moogle Stiltzkin: Aug 9 2015, 05:50 PM
SSJBen
post Aug 9 2015, 10:30 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Aug 9 2015, 02:56 PM)
tweakguide updated their nvidia graphics settings guide here
http://www.tweakguides.com/NVFORCE_1.html
very useful tips like
smile.gif
by the way just wondering, for AF do you all do global ? cause seems like then i'd have to set manually for the game disable af if it has in options  hmm.gif  so isn't just simpler leave 3d app, and only if the game doesn't have the option then manually set for that app ?
*
Not all games use AF, and it doesn't just apply to older games stuck with Trilinear or Bilinear filtering.

Recent games like Witcher 3 did not use AF on release day and it was only secretly added into the config text file after patch 1.04. It wasn't even working until 1.05 and there is no way to manually change it (as changing the values does nothing).

So yeah, I think it is best to set AF to its default Quality setting for global and then individually override it with High Quality on games that do have AF support.
kmarc
post Aug 10 2015, 09:25 AM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



After contemplating between a 390x and GTX980, finally decided to go for the GTX980. Granted that 390x is way cheaper and almost on par with 980 but I decided for the 980 due to lower power consumption and less heat. GTX980's overclocking headroom is also better but I don't normally overclock my parts nowadays except for testing..... old already.... biggrin.gif

Nice to be able to easily plop in the card, replacing my MSI GTX760, with the same 8-pin + 6-pin PCI-E connectors. Also great to see that the power consumption is about 190w (based on reviews) as compared to 165w on the old MSI.

Luckily my coolermaster 690 casing was able to accommodate the card, only with less than 2cm to spare!

And so, my first high-end card after using mid-end cards all my life!!!! (AMD 9600 pro, 7800 gs, 8800gts, 8800gt, GTX260, GTX460)

Now thinking whether I should go 1440p sweat.gif (gaming on 1080p all my life too!)
rav3n82
post Aug 10 2015, 09:38 AM

I find your lack of faith disturbing!
*******
Senior Member
7,084 posts

Joined: Feb 2011
From: Penang


QUOTE(kmarc @ Aug 10 2015, 09:25 AM)

*
Nice. smile.gif What brand 980 you got?
kmarc
post Aug 10 2015, 09:44 AM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(rav3n82 @ Aug 10 2015, 09:38 AM)
Nice. smile.gif What brand 980 you got?
*
ASUS STRIX. I got it locally due to worries on postage. Damn expensive in Sarawak.

Attached Image

No pic of the card because I was so excited to install the card. Forgot to take pic..... sweat.gif



Moogle Stiltzkin
post Aug 10 2015, 10:06 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(kmarc @ Aug 10 2015, 09:25 AM)
After contemplating between a 390x and GTX980, finally decided to go for the GTX980. Granted that 390x is way cheaper and almost on par with 980 but I decided for the 980 due to lower power consumption and less heat. GTX980's overclocking headroom is also better but I don't normally overclock my parts nowadays except for testing..... old already....  biggrin.gif

Nice to be able to easily plop in the card, replacing my MSI GTX760, with the same 8-pin + 6-pin PCI-E connectors. Also great to see that the power consumption is about 190w (based on reviews) as compared to 165w on the old MSI.

Luckily my coolermaster 690 casing was able to accommodate the card, only with less than 2cm to spare!

And so, my first high-end card after using mid-end cards all my life!!!! (AMD 9600 pro, 7800 gs, 8800gts, 8800gt, GTX260, GTX460)

Now thinking whether I should go 1440p  sweat.gif (gaming on 1080p all my life too!)
*
tx notworthy.gif
kmarc
post Aug 10 2015, 12:02 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(Moogle Stiltzkin @ Aug 10 2015, 10:06 AM)
tx  notworthy.gif
*
Why notworthy.gif ? Too expensive is it? biggrin.gif

If I'm in KL could have bought/COD it from LYN forumers for RM500 cheaper!!!! vmad.gif mad.gif vmad.gif

This post has been edited by kmarc: Aug 10 2015, 12:02 PM
alfiejr
post Aug 10 2015, 12:22 PM

Gaming~
******
Senior Member
1,294 posts

Joined: Feb 2012
From: Taman Rasa Sayang, Cheras


QUOTE(kmarc @ Aug 10 2015, 09:44 AM)
ASUS STRIX. I got it locally due to worries on postage. Damn expensive in Sarawak.

Attached Image

No pic of the card because I was so excited to install the card. Forgot to take pic.....  sweat.gif
*
Nice you should go with gsync monitor if can. 😊
TSskylinelover
post Aug 10 2015, 12:30 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(kmarc @ Aug 10 2015, 09:25 AM)
After contemplating between a 390x and GTX980, finally decided to go for the GTX980. Granted that 390x is way cheaper and almost on par with 980 but I decided for the 980 due to lower power consumption and less heat. GTX980's overclocking headroom is also better but I don't normally overclock my parts nowadays except for testing..... old already....  biggrin.gif

Nice to be able to easily plop in the card, replacing my MSI GTX760, with the same 8-pin + 6-pin PCI-E connectors. Also great to see that the power consumption is about 190w (based on reviews) as compared to 165w on the old MSI.

Luckily my coolermaster 690 casing was able to accommodate the card, only with less than 2cm to spare!

And so, my first high-end card after using mid-end cards all my life!!!! (AMD 9600 pro, 7800 gs, 8800gts, 8800gt, GTX260, GTX460)

Now thinking whether I should go 1440p  sweat.gif (gaming on 1080p all my life too!)
*
QUOTE(alfiejr @ Aug 10 2015, 12:22 PM)
Nice you should go with gsync monitor if can. 😊
*
1440p gsync all the way laugh.gif rclxms.gif
B_p_r
post Aug 10 2015, 12:49 PM

New Member
*
Junior Member
33 posts

Joined: Apr 2011
From: Land Of The Horses


hello wanna ask,where can i get a low profile gtx750ti?any seller here?want to upgrade from my old amd gpu.Tq icon_rolleyes.gif
Moogle Stiltzkin
post Aug 10 2015, 01:50 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(kmarc @ Aug 10 2015, 12:02 PM)
Why  notworthy.gif ? Too expensive is it?  biggrin.gif

If I'm in KL could have bought/COD it from LYN forumers for RM500 cheaper!!!!  vmad.gif  mad.gif  vmad.gif
*
hm i replied wrong post. was referring to sjj laugh.gif

anyway 1440p fps ultra seems doable even on 980ti. probably more so with pascal so why not doh.gif

question is will you go for an acer predator gsync ips monitor :] ? 144hz gsync ips. rm2xxx

This post has been edited by Moogle Stiltzkin: Aug 10 2015, 01:51 PM
kmarc
post Aug 10 2015, 02:37 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(alfiejr @ Aug 10 2015, 12:22 PM)
Nice you should go with gsync monitor if can. 😊
*
QUOTE(skylinelover @ Aug 10 2015, 12:30 PM)
1440p gsync all the way laugh.gif rclxms.gif
*
QUOTE(Moogle Stiltzkin @ Aug 10 2015, 01:50 PM)
hm i replied wrong post. was referring to sjj  laugh.gif

anyway 1440p fps ultra seems doable even on 980ti. probably more so with pascal so why not doh.gif

question is will you go for a acer predator gsync ips monitor :] ? 144hz gsync ips. rm2xxx
*
OMG! 1440p my wife will already strangle me. 1440p gsync..... go straight 6-feet under! biggrin.gif

Actually, not willing to spend money for premium gsync tech at the moment.... hopefully next time when the prices comes down. smile.gif
PsychoHDxMachine
post Aug 10 2015, 04:38 PM

Getting Started
**
Junior Member
249 posts

Joined: Sep 2014
QUOTE(kmarc @ Aug 10 2015, 03:37 PM)
OMG! 1440p my wife will already strangle me. 1440p gsync..... go straight 6-feet under!  biggrin.gif

Actually, not willing to spend money for premium gsync tech at the moment.... hopefully next time when the prices comes down. smile.gif
*
"Kastam" too strict already icon_question.gif
Same case with you doh.gif
Moogle Stiltzkin
post Aug 10 2015, 04:58 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(kmarc @ Aug 10 2015, 02:37 PM)
OMG! 1440p my wife will already strangle me. 1440p gsync..... go straight 6-feet under!  biggrin.gif

Actually, not willing to spend money for premium gsync tech at the moment.... hopefully next time when the prices comes down. smile.gif
actually if you already have a nvidia gpu, then you've got your foot halfway into getting gsync capability.

hm... i think that monitor will be in the low 3k range now from what i now heard sad.gif


Anyway, while you're waiting for the price to drop, i suspect the next thing to come out will be quantum dot film which they will layer on the panel. Will give better colors. Is the next thing over the horizon :}

This post has been edited by Moogle Stiltzkin: Aug 10 2015, 05:06 PM
TSskylinelover
post Aug 10 2015, 05:12 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Aug 10 2015, 04:58 PM)
actually if you already have a nvidia gpu, then you've got your foot halfway into getting gsync capability.

hm... i think that monitor will be in the low 3k range now from what i now heard  sad.gif
Anyway while your waiting for price to drop. i suspect the next thing to come out will be quantum dot film which they will layer on the panel. Will give better colors. Is the next thing over the horizon :}
*
Haha TN panel i dont like laugh.gif doh.gif after hopping in IPS
Moogle Stiltzkin
post Aug 10 2015, 06:56 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(skylinelover @ Aug 10 2015, 05:12 PM)
Haha TN panel i dont like laugh.gif doh.gif after hopping in IPS
*
i wonder if the latencies between this acer and a asus rog for example is that big a difference for a fps gamer hmm.gif i doubt it.

SUSTheHitman47
post Aug 10 2015, 07:44 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(kmarc @ Aug 10 2015, 09:25 AM)
After contemplating between a 390x and GTX980, finally decided to go for the GTX980. Granted that 390x is way cheaper and almost on par with 980 but I decided for the 980 due to lower power consumption and less heat. GTX980's overclocking headroom is also better but I don't normally overclock my parts nowadays except for testing..... old already....  biggrin.gif

Nice to be able to easily plop in the card, replacing my MSI GTX760, with the same 8-pin + 6-pin PCI-E connectors. Also great to see that the power consumption is about 190w (based on reviews) as compared to 165w on the old MSI.

Luckily my coolermaster 690 casing was able to accommodate the card, only with less than 2cm to spare!

And so, my first high-end card after using mid-end cards all my life!!!! (AMD 9600 pro, 7800 gs, 8800gts, 8800gt, GTX260, GTX460)

Now thinking whether I should go 1440p  sweat.gif (gaming on 1080p all my life too!)
*
144fps is the way to go. laugh.gif
kmarc
post Aug 10 2015, 08:26 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(PsychoHDxMachine @ Aug 10 2015, 04:38 PM)
"Kastam" too strict already  icon_question.gif
Same case with you  doh.gif
*
"Kastam" can be bribed one...... brows.gif

QUOTE(Moogle Stiltzkin @ Aug 10 2015, 04:58 PM)
actually if you already have a nvidia gpu, then you've got your foot halfway into getting gsync capability.

hm... i think that monitor will be in the low 3k range now from what i now heard  sad.gif
Anyway while your waiting for price to drop. i suspect the next thing to come out will be quantum dot film which they will layer on the panel. Will give better colors. Is the next thing over the horizon :}
*
As mentioned, not willing to put the other foot in due to price. Definitely the itch is there but just have to scratch and scratch and scratch......

Anyway, going to survey and see what 1440p monitors are good. smile.gif

QUOTE(TheHitman47 @ Aug 10 2015, 07:44 PM)
144fps is  a way to go.  laugh.gif
*
144fps? shocking.gif Never had such frame rates unless it is 5-10 year old games!!!
clawhammer
post Aug 10 2015, 09:19 PM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(kmarc @ Aug 10 2015, 08:26 PM)
144fps? shocking.gif Never had such frame rates unless it is 5-10 year old games!!!
I tried BF4 on ASUS ROG Swift with my 980 Ti SLI, really can get 144Hz solid biggrin.gif

This post has been edited by clawhammer: Aug 10 2015, 09:19 PM
kmarc
post Aug 10 2015, 09:39 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(clawhammer @ Aug 10 2015, 09:19 PM)
I tried BF4 on ASUS ROG Swift with my 980 Ti SLI, really can get 144Hz solid biggrin.gif
*
I meant my usual mid-range cards can never achieve such high fps. Yours is super high-end.... of course can la... tongue.gif

This post has been edited by kmarc: Aug 10 2015, 09:39 PM
shikimori
post Aug 10 2015, 09:53 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


seems like people preferred G-sync over freesync in a blind test. Perhaps the limitation of the MG279q's 35-90hz range is ruining the freesync experience

user posted image

http://www.tomshardware.com/reviews/amd-fr...event,4246.html

nice article to read if you are going for variable refresh rate monitor


Oh ya, btw do you guys know how long it takes to get a replacement GPU on a warranty claim for Asus ?

This post has been edited by shikimori: Aug 10 2015, 09:57 PM
SUSTheHitman47
post Aug 10 2015, 10:06 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(kmarc @ Aug 10 2015, 08:26 PM)
"Kastam" can be bribed one......  brows.gif
As mentioned, not willing to put the other foot in due to price. Definitely the itch is there but just have to scratch and scratch and scratch......

Anyway, going to survey and see what 1440p monitors are good.  smile.gif
144fps? shocking.gif Never had such frame rates unless it is 5-10 year old games!!!
*
errr...need 5-10 year old games to play on a 144hz monitor?

This post has been edited by TheHitman47: Aug 10 2015, 10:07 PM
shikimori
post Aug 10 2015, 10:08 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(kmarc @ Aug 10 2015, 08:26 PM)
"Kastam" can be bribed one......  brows.gif
As mentioned, not willing to put the other foot in due to price. Definitely the itch is there but just have to scratch and scratch and scratch......

Anyway, going to survey and see what 1440p monitors are good.  smile.gif
144fps? shocking.gif Never had such frame rates unless it is 5-10 year old games!!!
*
go for Acer XB270hu and pray you wont get dead pixels biggrin.gif
clawhammer
post Aug 10 2015, 10:31 PM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(kmarc @ Aug 10 2015, 09:39 PM)
I meant my usual mid-range cards can never achieve such high fps. Yours is super high-end.... of course can la...  tongue.gif
*
Sorry biggrin.gif However I can tell you the experience is certainly a huge difference compared to normal 60 Hz/FPS.
kmarc
post Aug 10 2015, 11:01 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(TheHitman47 @ Aug 10 2015, 10:06 PM)
errr...need 5-10 years old games to play on 144hz monitor?.
*
Haha, I meant older games with lower graphics requirements can run at hundreds of fps.... but still limited to a 60Hz LCD.

QUOTE(shikimori @ Aug 10 2015, 10:08 PM)
go for Acer XB270hu and pray you wont get dead pixels biggrin.gif
*
Cool. Will do some research..... smile.gif

QUOTE(clawhammer @ Aug 10 2015, 10:31 PM)
Sorry biggrin.gif However I can tell you the experience is certainly a huge difference compared to normal 60 Hz/FPS.
*
I don't doubt that...... Still a lot of things yet to experience... 120/144hz gaming, 4K gaming, Gsync/FreeSync, Triple-monitor setup, etc.....

One day........
clawhammer
post Aug 10 2015, 11:03 PM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(kmarc @ Aug 10 2015, 11:01 PM)
One day........
*
The day is tomorrow biggrin.gif Anyway, computers are an expensive hobby.
TSskylinelover
post Aug 11 2015, 07:23 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(clawhammer @ Aug 10 2015, 11:03 PM)
The day is tomorrow biggrin.gif Anyway, computers are an expensive hobby.
*
Haha so true. laugh.gif rclxms.gif

Though i had xbox1 already but i still say PC all the way hahahaha
Moogle Stiltzkin
post Aug 11 2015, 09:52 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(shikimori @ Aug 10 2015, 09:53 PM)
seems like people preferred G-sync over freesync on a blindtest . Perhaps the limitation on the MG279q 35-90hz ruining the freesync experience

user posted image

http://www.tomshardware.com/reviews/amd-fr...event,4246.html

nice article to read if you are going for variable refresh rate monitor
Oh ya , btw do you guys know how long it is to get replacement GPU warranty claim for Asus ?
*
the only major difference is how it handles below its minimum vrr window.

on paper gsync handles it all the way down to 0. whereas freesync starts having issues.

so the tests would have to focus on this area specifically hmm.gif

user posted image
QUOTE
What you're seeing here is a graphed output of how A-Sync and G-Sync behave with regards to refresh rates as framerate begins to drop to low levels.




fast forward to 50sec into the video and 6:30



freesync can achieve proper behaviour under the vrr window that works just as well as nvidia's. but that depends on whether they want to implement it or not. freesync has the potential to be just as good, but it's not there yet sad.gif

This post has been edited by Moogle Stiltzkin: Aug 11 2015, 10:09 AM
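
To illustrate the general idea being discussed (repeating frames when the game falls below the panel's minimum variable refresh rate), here is a toy python sketch; the numbers are just examples and this is not how either vendor's driver is actually implemented:
CODE
# toy sketch of low-framerate compensation: when the game drops below the
# panel's minimum VRR rate, show each frame more than once so the panel
# still refreshes inside its supported window
PANEL_MIN_HZ = 35   # e.g. the MG279Q's lower bound mentioned above
PANEL_MAX_HZ = 90

def refresh_for(game_fps):
    if game_fps >= PANEL_MIN_HZ:
        # inside the window: refresh at the game's own rate (capped at the max)
        return min(game_fps, PANEL_MAX_HZ), 1
    # below the window: repeat the frame until the effective rate is back inside
    repeats = 2
    while game_fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return game_fps * repeats, repeats

for fps in (60, 40, 25, 12):
    hz, repeats = refresh_for(fps)
    print(f"game at {fps} fps -> panel refreshes at {hz} Hz ({repeats}x per frame)")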
PsychoHDxMachine
post Aug 11 2015, 09:54 AM

Getting Started
**
Junior Member
249 posts

Joined: Sep 2014
QUOTE(kmarc @ Aug 10 2015, 10:39 PM)
I meant my usual mid-range cards can never achieve such high fps. Yours is super high-end.... of course can la...  tongue.gif
*
Yes. Beyond our expectations. Haha. 980ti sli cost at least 6k.

QUOTE(TheHitman47 @ Aug 10 2015, 11:06 PM)
errr...need 5-10 years old games to play on 144hz monitor?.
*
Like batman game if not mistaken. At least will go 3 digits fps.

QUOTE(clawhammer @ Aug 10 2015, 11:31 PM)
Sorry biggrin.gif However I can tell you the experience is certainly a huge difference compared to normal 60 Hz/FPS.
*
Yes... have to work hard ! Then only can afford to buy. Or rhb easy! ! Hahha
Najmods
post Aug 11 2015, 10:18 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(clawhammer @ Aug 10 2015, 11:03 PM)
The day is tomorrow biggrin.gif Anyway, computers are an expensive hobby.
*
No not really. Only when you demand certain things that aren't necessary (4K, 144Hz or fps in all games, 3D, multi monitor etc) does it become expensive.

Computers became quite stagnant these past few years. Midrange cards around RM700-900 can run games at 1920x1080 fluently while most budget cards around RM500 could reach that resolution with a few settings turned down. Going 2nd hand with that budget could reach as high as 2560x1440, as 290/X cards, which sold anywhere from RM750-1100, could handle that resolution.
TSskylinelover
post Aug 11 2015, 12:35 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(PsychoHDxMachine @ Aug 11 2015, 09:54 AM)
Yes. Beyond our expectations. Haha. 980ti sli cost at least 6k.
Like batman game if not mistaken. At least will go 3 digits fps.
Yes... have to work hard ! Then only can afford to buy. Or rhb easy! ! Hahha
*
And keep praying RM will not sink deeper than titanic laugh.gif doh.gif
TSskylinelover
post Aug 11 2015, 12:37 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Najmods @ Aug 11 2015, 10:18 AM)
Computer became quite stagnant these past few years. Midrange cards around RM700-900 can run games at 1920x1080 fluently while most budget cards around RM500 could reach that resolution with few setting turned down. Going 2nd hand with that budget could reach as high as 2560x1440 as 290/X cards which sold anywhere from RM750-1100 could handle that resolution.
*
Haha so true laugh.gif rclxms.gif
clawhammer
post Aug 12 2015, 02:53 AM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(skylinelover @ Aug 11 2015, 07:23 AM)
Haha so true. laugh.gif rclxms.gif

Though i had xbox1 already but i still say PC all the way hahahaha
I used to love console gaming when I was young but ever since trying out PC's, I never looked back.


QUOTE(PsychoHDxMachine @ Aug 11 2015, 09:54 AM)
Yes... have to work hard ! Then only can afford to buy. Or rhb easy! ! Hahha
PC hardware is getting more and more expensive for us. 10 years back the highest end graphics card would only be 1K+ but these days it can go around 3K+ (980 Ti). Not to mention other parts like CPU, motherboard, etc.


QUOTE(Najmods @ Aug 11 2015, 10:18 AM)
No not really. Only when you demand certain things that isn't necessary (4K, 144Hz or fps in all games, 3D, multi monitor etc) then it become expensive.

Computer became quite stagnant these past few years. Midrange cards around RM700-900 can run games at 1920x1080 fluently while most budget cards around RM500 could reach that resolution with few setting turned down. Going 2nd hand with that budget could reach as high as 2560x1440 as 290/X cards which sold anywhere from RM750-1100 could handle that resolution.
*
A decent setup that allows you to enjoy proper gaming would cost you +/- 3K range. You can easily calculate (i5, 8GB RAM, board, mid end graphics card) and this does not include other components like SSD, LCD, keyboard.
My belief is that everyone deserves to enjoy life and this means enjoying their PC experiences as well. If someone doesn't bother much about technology, has few demands of what PC gaming means, turns down settings to medium or low just to play the game (even at 1080p - seriously, in 2015?), uses a HDD and makes coffee while waiting for games to load, uses a RM20 mouse/keyboard, then yes, buy everything used and settle with 1080p like what you said. Buying used might be an option but old hardware could potentially cause more problems than fun, and for someone that doesn't know how to deal with it, that's extra headache.

I wouldn't even want to get close to telling people to game at 1080p with "a few settings turned down" because I myself could not accept it. To me, that's not PC gaming and you might as well go buy a console. I'm not even talking about a high end build, a decent graphics card alone is RM700-900 like you mention and that itself is not expensive? rolleyes.gif Ok, I think that's subjective laugh.gif
Najmods
post Aug 12 2015, 06:19 AM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(clawhammer @ Aug 12 2015, 02:53 AM)
A decent setup that allows you to enjoy proper gaming would cost you +/- 3K range. You can easily calculate (i5, 8GB RAM, board, mid end graphics card) and this does not include other components like SSD, LCD, keyboard.
My believe is that everyone deserves to enjoy life and this means enjoying their PC experiences as well. If someone doesn't bother much about technology, has little demands of what PC gaming means, turn down settings to medium or low just to play the game (even at 1080p - seriously in 2015?), use a HDD and make coffee while waiting for games to load, use a RM20 mouse/keyboard then yes, buy everything used and settle with 1080p like what you said. Buying used might be an option but old hardware could potentially cause problems more than fun and for someone that doesn't know how to deal with it, that's extra head ache.

I wouldn't even want to get close telling people to game at 1080p with "few setting turned down" because I myself could not accept it. To me, that's not PC gaming and might as well go buy a console. I'm not even talking about a high end build, a decent graphics card alone is RM700-900 like you mention and that itself is not expensive? rolleyes.gif Ok, I think that's subjective laugh.gif
*
You're implying 'I' and 'myself' in your description, and like you say it's subjective. There is no right or wrong; the reason I mentioned the price is that is what most people do when recommending graphics cards in this forum, saying something like 'top up a bit and get this'. Everyone has different requirements, different games they play, some enjoy DOTA or FIFA which aren't that demanding, some enjoy FPS, some enjoy MMORPGs. Some could play at 20-30fps, some can't. For the likes of you maybe RM3K is cheap; some even want to buy a whole rig with a RM1k budget. I even saw someone playing FPS games at single digit frame rates but still enjoying it because that's the rig they could afford.
shikimori
post Aug 12 2015, 06:56 AM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(clawhammer @ Aug 12 2015, 02:53 AM)
I used to love console gaming when I was young but ever since trying out PC's, I never looked back.
PC hardware are getting more and more expensive for us. 10 years back the highest end graphics card would only be 1K+ but these days it can go around 3K+ (980 Ti). Not to mention other parts like CPU, motherboard, etc.
A decent setup that allows you to enjoy proper gaming would cost you +/- 3K range. You can easily calculate (i5, 8GB RAM, board, mid end graphics card) and this does not include other components like SSD, LCD, keyboard.
My believe is that everyone deserves to enjoy life and this means enjoying their PC experiences as well. If someone doesn't bother much about technology, has little demands of what PC gaming means, turn down settings to medium or low just to play the game (even at 1080p - seriously in 2015?), use a HDD and make coffee while waiting for games to load, use a RM20 mouse/keyboard then yes, buy everything used and settle with 1080p like what you said. Buying used might be an option but old hardware could potentially cause problems more than fun and for someone that doesn't know how to deal with it, that's extra head ache.

I wouldn't even want to get close telling people to game at 1080p with "few setting turned down" because I myself could not accept it. To me, that's not PC gaming and might as well go buy a console. I'm not even talking about a high end build, a decent graphics card alone is RM700-900 like you mention and that itself is not expensive? rolleyes.gif Ok, I think that's subjective laugh.gif
*
I couldn't agree more. Playing at low/medium or 1080p is a big no-no. Once you have tasted ultra settings at 1440p or 4k there is no going back. I'd rather read a book than play games at low 1080p settings but that's just me.
TSskylinelover
post Aug 12 2015, 07:45 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(clawhammer @ Aug 12 2015, 02:53 AM)
I used to love console gaming when I was young but ever since trying out PC's, I never looked back.
*
4 me it's the opposite. I was locked away from console buying even after begging my parents so many bloody times. So 4 the past 30 years, i have resisted many generations of consoles, 6 gran turismos and 5 forza motorsports. sweat.gif rclxub.gif Even with project cars, i still find something missing in life. laugh.gif doh.gif so this month i finally chose 2 grab xbox1 and forza 6 the following month. I microsoft fanboy since xbox and forza 1 came out 10 years ago. laugh.gif rclxms.gif

End of the day, console 4 me is just 2 play that particular game. laugh.gif rclxms.gif nothing else. Haha.

This post has been edited by skylinelover: Aug 12 2015, 07:46 AM
clawhammer
post Aug 12 2015, 11:08 AM

///M
Group Icon
VIP
8,788 posts

Joined: Jan 2003
From: Kuala Lumpur




QUOTE(Najmods @ Aug 12 2015, 06:19 AM)
You implying 'I' and 'myself' in your description, like you say it's subjective.
LOL, I was referring to "a decent graphics card alone is RM700-900". I don't think that's cheap.

QUOTE
There is no right or wrong, the reason I mentioned the price is that is what most people do when recommending graphics card in this forum, saying something like 'top up a bit and get this'.
You can't just go by the forums and not everyone buys used item. A vast majority still shop for new hardware because if not, shops in Low Yat will close down and go bankrupt. You will be surprised how much the rental would cost and not to mention employee salaries.

QUOTE
Everyone have different requirement, different games they play, some enjoyed DOTA, FIFA which isn't that demanding, some enjoyed FPS, some enjoyed MMORPG. Some could play at 20-30fps, some can't.
Trust me, you start a poll or do a survey outside Low Yat plaza. I'm pretty sure not many would want to play at 20-30 FPS (Seriously, 20 FPS?! shocking.gif)

QUOTE
For the likes of you maybe RM3K is cheap, some even want to buy a whole rig with RM1k budget. I even sees someone play games reaching single digit in FPS games but still enjoying it because that's the rig they could afford.
I mentioned PC is an expensive hobby so I'm not sure when I stated that RM3K is cheap. There's no RM1K budget rig around anymore, seriously. Get an i3, RAM, board and that is already close to RM1K. You mean the GPU, PSU, casing, HDD, keyboard, mouse, LCD can be bought for another RM200?

You certainly know very interesting people who can game at 9FPS or lower, so I need to salute you for that.


QUOTE(shikimori @ Aug 12 2015, 06:56 AM)
I couldnt agree more . Playing at low /medium or 1080p is a big nono . Once you have taste ultra setting at 1440p or 4k there is no going back . I rather read a book than play games at low 1080p settings but thats just me.
*
1080p is a resolution of the past. If someone really wants to game 1080p then at least go for something like 144Hz, super ultra settings, triple monitor or some sort, haha.


QUOTE(skylinelover @ Aug 12 2015, 07:45 AM)
I terbalik. I was locked away from console buying even after begging my parents so many bloody times. So 4 the past 30 years, i have resisted many generation consoles, 6 gran turismos and 5 forza motorsports. sweat.gif rclxub.gif Even with project cars, i still find something missing in life. laugh.gif doh.gif so this month i finally chose 2 grab xbox1 and forza 6 the following month. I microsoft fanboy since xbox and forza 1 came out 10 years ago. laugh.gif rclxms.gif

End of the day, console 4 me is just 2 play that particular game. laugh.gif rclxms.gif nothing else. Haha.
*
I get what you mean and sometimes it's just about getting what we wanted all these while biggrin.gif

antaras
post Aug 12 2015, 11:23 AM

Getting Started
**
Junior Member
196 posts

Joined: Jan 2010
From: Kuala Lumpur


hmmm... I'm still gaming on a 1080p 60hz monitor. That's as far as my card can go for me. However, I definitely would wanna try higher refresh rate mons. Then again, I'm not sure if I'm willing to turn off a few settings to do that though.
eatsleepnDIE
post Aug 12 2015, 12:17 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


yeah me too. im still gaming on 1080p monitor but with 120hz refresh rate. trying to achieve that while playing the witcher 3 and failed miserably lol
TSskylinelover
post Aug 12 2015, 12:28 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Woohoo. CGPA 4 ringgit equals 1 USD. At this rate, i think i rather jump 2 volta in 2017. It will need some sort of miracles 2 ever get back 2 zaman 2.8 equals a dollar. laugh.gif doh.gif
SUSTheHitman47
post Aug 12 2015, 01:35 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(skylinelover @ Aug 12 2015, 12:28 PM)
Woohoo. CGPA 4 ringgit equals 1 USD. At this rate, i think i rather jump 2 volta in 2017. It will need some sort of miracles 2 ever get back 2 zaman 2.8 equals a dollar. laugh.gif doh.gif
*
i feel like it was just last week that our 2.8 = 1 dollar.  cry.gif

i dont know if i can still continue with my plan of changing to itx.
SSJBen
post Aug 12 2015, 05:47 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(skylinelover @ Aug 12 2015, 12:28 PM)
Woohoo. CGPA 4 ringgit equals 1 USD. At this rate, i think i rather jump 2 volta in 2017. It will need some sort of miracles 2 ever get back 2 zaman 2.8 equals a dollar. laugh.gif doh.gif
*
Lol 2017? We should plan to take refuge elsewhere by then instead, if nothing changes over the next year.
Najmods
post Aug 12 2015, 06:02 PM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(clawhammer @ Aug 12 2015, 11:08 AM)
Trust me, you start a poll or do a survey outside Low Yat plaza. I'm pretty sure not many would want to play at 20-30 FPS (Seriously, 20 FPS?! shocking.gif)

You certainly know very interesting people which can game with 9FPS or lower so I need to salute you for that.
*

Yep, I saw my cousin play Red Alert 2 at merely 2-5fps on a Pentium 233MHz MMX, but you'd be surprised how massive his base is. I don't know how he had the patience to play it. Even I once played at that speed as well but I couldn't handle it (played on a Pentium 166MHz). From the loading screen to building the first power plant took 15 minutes. No joke.

QUOTE(clawhammer @ Aug 12 2015, 11:08 AM)
I mention PC is an expensive hobby so I'm not sure when did I state that RM3K is cheap. There's no more RM1K budget rig around seriously. Get an i3, RAM, board and that is already close to RM1K. You mean the GPU, PSU, casing, HDD, keyboard, mouse, LCD can be bought for another RM200?
I was merely judging from your rig and the stuff you sold; I actually wanted to buy your CM PSU but maybe next month tongue.gif

Nope, been asked for that kind of budget for a gaming laptop, gaming PC etc but I can't recommend them any since there isn't one fit for gaming sweat.gif At a minimum it's always at least RM1.5k; squeezing in an APU could get to that budget, with a cheap LCD and generic case/PSU/keyboard + mouse combo.

When I see people asking for something like 'lower than RM500' and then someone posts 'top up RM100 or RM200 and get these', I don't really agree with that unless the OP stated he can go that far, because for some people even RM50 means a lot.
Moogle Stiltzkin
post Aug 12 2015, 08:52 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
i was thinking of getting pascal with waterblock. but looking at currency 1:4 to the usdollar, i may have to stick to stock fan cooler sad.gif

and if it gets even worse, i'd have to downgrade the gpu to the medium or lower spec version of the pascal variant shakehead.gif



This post has been edited by Moogle Stiltzkin: Aug 14 2015, 05:50 PM
TSskylinelover
post Aug 13 2015, 01:44 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(TheHitman47 @ Aug 12 2015, 01:35 PM)
i feels like it was last week our 2.8 = 1dollar.  cry.gif

i dont know if i can still continue with my plan changing to itx.
*
I mean american dollars laugh.gif

Think was 2 years back i buy hell a lot of goodies from ebay with 2.8 per american dollars including my GPU hahahaha rclxms.gif

You should abort the plan and take refuge elsewhere doh.gif

QUOTE(Moogle Stiltzkin @ Aug 12 2015, 08:52 PM)
i was thinking of getting pascal with waterblock. but looking at currency 1:4 to the usdollar, i may have to stick to stock fan cooler sad.gif

and if it gets even worse, i'd have to downgrade the gpu to the medium or lower spec version of the pascal variant  shakehead.gif


*
I feel your pain brah doh.gif doh.gif
Moogle Stiltzkin
post Aug 13 2015, 12:13 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
How much video memory is enough?
4GB versus the world
http://techreport.com/blog/28800/how-much-...emory-is-enough


bottomline
QUOTE
Of course, much of what we've just demonstrated about memory capacity constraints is kind of academic for reasons we've noted. On a practical level, these results match what we saw in our initial reviews of the R9 Fury and Fury X: at resolutions of 4K and below, cards with 4GB of video memory can generally get by just fine, even with relatively high image quality settings. Similarly, the GeForce GTX 970 seems to handle 4K gaming quite well in spite of its funky partitioned memory. Meanwhile, at higher resolutions, no current single-GPU graphics card is fast enough for fluid gaming, no matter how much memory it might have. Even with 12GB, the Titan X averages less than 30 FPS in Shadow of Mordor at 5760x3240.


QUOTE
The biggest concern, though, is future games that simply require more memory due to the use of higher-quality textures and other assets. AMD has a bit of a challenge to manage, and it will likely need to tune its driver software carefully during the Fury's lifetime in order to prevent occasional issues. Here's hoping that work is effective.


some games are already 20gb+ rclxub.gif so those flash drives need to start gaining more capacity more cheaply before games get too big to fit sweat.gif games on a regular hdd is no longer good enough sad.gif

QUOTE
When or if I buy Wolfenstein: The New Order this year, it will probably be the physical PC version because I don’t want to have to download 50GB on my internet connection and monthly data cap. When I download a game on Steam my maximum download speed is around 1.5 Megabytes per second. Installing a 20GB game usually takes me six hours. If Call of Duty Ghosts — somewhere north of 40GB, shows up on a Steam free weekend, I’d have to spend 12 of those free hours downloading the game. On that note maybe EA’s Origin Game Time is a better take on the free trial idea since it doesn’t start your clock until you’ve actually installed the game.


QUOTE
Developers should use less pre-rendered FMVs for one thing. The PC version of Metal Gear Rising: Revengeance is nearly 25GB, but as I understand it around 18 of those gigs are consumed by FMVs. The actual game is somewhere around 5GB. The recent PC release of Final Fantasy XIII weighs in at 60GB, with FMVs accounting for 46GB.

FMVs made sense back in the 90’s when real-time video game graphics couldn’t display events in storylines as convincingly. Pre-rendered CG graphics are always a generation ahead of video game graphics, but I’d say real-time graphics have gotten good enough to fully convey storylines. Plus, they mesh better with the actual gameplay. FMVs have also only accelerated their file size increases with the move to encoding them in 1080p.

Somewhat odd are games that use FMVs that are pre-rendered with the same graphics as gameplay. Maybe things are rendered in those cut scenes that the gameplay engine can’t handle or doesn’t need, but I think developers should still try if it can make the difference in file sizes. Can some developers do a better job of compressing the video files if FMVs are unavoidable? Maybe some games should do a better job of storytelling that conveys more through gameplay and less through cut scenes and FMVs.


QUOTE
Then you have asset quality. Titanfall is 48GB on PC because it uses uncompressed audio which is easier on dual core processors. In addition to choice of languages, why don’t they give players a choice on whether uncompressed audio matters to them. The same goes for textures. Some pirate versions of Max Payne 3 come with only one audio language or with compressed textures to cut down on the download size. I wonder how many customers would be receptive to official distributors doing the same thing.


http://venturebeat.com/community/2014/10/1...ng-out-of-hand/



i don't agree that they should use less fmvs. i'd rather they just increase flash drive capacities more cost effectively to accommodate future gaming.

maybe something like intel's x-point will save the day icon_idea.gif


think the first product will be in 2016 hmm.gif
user posted image


This post has been edited by Moogle Stiltzkin: Aug 13 2015, 12:34 PM
SSJBen
post Aug 13 2015, 03:14 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Witcher 3 had very, very, little FMVs. The game is only 30GB+ including the latest patches yet is one of the biggest games in recent history. Yet, it has some of the most terrible animation rigging in a AAA game, ever. doh.gif

FMVs would have helped Witcher 3 immensely to be honest.

CDPR does not have Fox Engine, just sayin'.
Moogle Stiltzkin
post Aug 13 2015, 07:07 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(SSJBen @ Aug 13 2015, 03:14 PM)
Witcher 3 had very, very, little FMVs. The game is only 30GB+ including the latest patches yet is one of the biggest games in recent history. Yet, it has some of the most terrible animation rigging in a AAA game, ever. doh.gif

FMVs would have helped Witcher 3 immensely to be honest.

CDPR does not have Fox Engine, just sayin'.
*
hm... probably still better than command and conquer generals. they just cut out fmvs altogether. cut cost on actors sad.gif

and sadly even the last tiberium series title was shit. kane's acting alone could not save the rest of the poor fmvs to do his acting any justice doh.gif

in diablo 3, which i closed beta tested a while back, they didn't do fmv for everything. some scenes use ingame characters to act out the scenes. another way to reduce fmvs.... cost cutting i guess. or maybe they didn't want scenes in parts of the game where they'd needlessly ruin the pacing hmm.gif

This post has been edited by Moogle Stiltzkin: Aug 13 2015, 07:27 PM
TSskylinelover
post Aug 13 2015, 07:22 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Haha. I still think red alert 2 has the best FMV ever and nothing has beat it since yo. laugh.gif rclxms.gif
eatsleepnDIE
post Aug 13 2015, 11:19 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


QUOTE(skylinelover @ Aug 13 2015, 07:22 PM)
Haha. I still think red alert 2 has the best FMV ever and nothing has beat it since yo. laugh.gif rclxms.gif
*
Seconded this..that russian babe was hot!
Moogle Stiltzkin
post Aug 14 2015, 04:11 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(skylinelover @ Aug 13 2015, 07:22 PM)
Haha. I still think red alert 2 has the best FMV ever and nothing has beat it since yo. laugh.gif rclxms.gif
*
best story narration is definitely final fantasy 7. the fmvs might not be up to today's standard, but come a reboot, who knows. modern fmv + ff7 reboot = killer game icon_idea.gif
Moogle Stiltzkin
post Aug 18 2015, 12:17 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
the first dx12 game as far as i know
Ashes of the Singularity
http://www.extremetech.com/gaming/212314-d...go-head-to-head


:/ is the game any fun ?


SSJBen
post Aug 18 2015, 10:47 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Aug 18 2015, 12:17 PM)
the first dx12 game as far i know
Ashes of the Singularity
http://www.extremetech.com/gaming/212314-d...go-head-to-head
:/ is the game any fun ?
*
No, it's a boring as fudge RTS. It's also terribly unbalanced at the moment.
antaras
post Aug 19 2015, 10:56 AM

Getting Started
**
Junior Member
196 posts

Joined: Jan 2010
From: Kuala Lumpur


QUOTE(SSJBen @ Aug 18 2015, 10:47 PM)
No, it's a boring as fudge RTS. It's also terribly unbalanced at the moment.
*
Will need to see if new drivers are going to improve the performance or not. Currently, AMD is kicking ass with DX12. Then again, it's only ONE benchmark. In any case, still very interesting to see. rclxms.gif
SHOfrE3zE
post Aug 20 2015, 09:30 AM

Drop It Like It's Hot
******
Senior Member
1,895 posts

Joined: Jan 2003
From: Shah Alam


guys, anyone having problems running GeForce Experience on Windows 10?
It was fine before but now it won't even launch and gets stuck on the loading screen when i execute the program.

Tried uninstalling and installing it back but the problem persists.
Moogle Stiltzkin
post Aug 20 2015, 05:54 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE
Intel has, according to The Tech Report, decided to support Adaptive-Sync -- but not necessarily in their current product line. David Blythe of Intel would not comment on specific dates or release windows, just that it is in their plans. This makes sense for Intel because it allows their customers to push settings higher while maintaining a smooth experience, which matters a lot for users of integrated graphics.

While “AMD FreeSync” is a stack of technologies, VESA DisplayPort Adaptive-Sync should be all that is required on the monitor side. This should mean that Intel has access to all of AMD's adaptive refresh monitors, although the driver and GPU circuitry would need to be their burden. G-Sync monitors (at least those with NVIDIA-design modules -- this is currently all of them except for one laptop I think) would be off limits, though.




QUOTE
This is BIG. Intel recently bought Altera, the people who make the G-Sync module for Nvidia. They also cross-license a lot of Nvidia GPU tech.

They would have more insight to the future viability of G-Sync than anyone aside from Nvidia themselves and to decide to go the AMD route.

THATS BIG!!!


QUOTE
That is a bit misleading. Altera just makes FPGAs. They may not have any actual data on how Nvidia's g-sync module works unless Nvidia sent them the design for debugging or something. Using an FPGA is kind of like using a CPU that you then need to write software for. If I buy a CPU, the maker of that CPU doesn't know what software I run on it. FPGAs are programmed using a hardware description language like Verilog rather than a software programming language. Altera doesn't necessarily have access to the Verilog that Nvidia uses to program the FPGA.

If Nvidia is confident that there will be a larger volume of g-sync modules sold then they can actually use the verilog design to create a fixed function ASIC. This should be much cheaper, if there is sufficient volume. I tried to find out the price of the FPGA Nvidia is using, and it looked like it was around $200 in small volumes, if I had the right one. Nvidia would get a better price for a large number of parts though. I don't know who takes the FPGA and mounts it on a board to make the actual g-sync module. Nvidia probably just contracts this out to some other company.



any thoughts on this ?

would it be possible for a gsync like performance but via the intel adaptive-sync plan ?
http://www.pcper.com/news/Graphics-Cards/I...e-Sync#comments

This post has been edited by Moogle Stiltzkin: Aug 20 2015, 05:57 PM
cstkl1
post Aug 20 2015, 07:54 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Moogle Stiltzkin @ Aug 20 2015, 05:54 PM)
any thoughts on this ?

would it be possible for a gsync like performance but via the intel adaptive-sync plan ?
http://www.pcper.com/news/Graphics-Cards/I...e-Sync#comments
*
Third quote is correct.

Intel won't have the code written onto the fpga.
And nvidia's coding is different for each panel, which itself has a qc line for rejection.
Come to think of it.. lol, are freesync monitors mixed with the rejected panels that couldn't pass the gsync test??


Desprado
post Aug 20 2015, 11:20 PM

Getting Started
**
Junior Member
258 posts

Joined: Feb 2012
Damn, i bought an MSI GTX 980 again.

It has 80.4 ASIC quality and Elpida VRAM.

I am surprised that i am running this card @1545MHz and memory clock 3900MHz (7800MHz effective).

I can even push the VRAM further, but it is Elpida so i am scared to do that.
SSJBen
post Aug 21 2015, 12:16 AM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Desprado @ Aug 20 2015, 11:20 PM)
Dam i bought MSI GTX 980 again.

It has 80.4 asiq quality and Elpida Vram

I am surprised that i am running this card @1545Mhz and memory clock 3900mhz (7800mhz).

I can even push more further the vram but it is Elpida  so i am scared to that.
*
Anything over 7500 for Elpida is awesome. 7800 is icing on the cake already, awesome that you got it that high.
Minecrafter
post Aug 21 2015, 04:42 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


Zotac GTX950 AMP! is performing almost as well as a stock clock speeds GTX960 in some games, and slightly better in others. shocking.gif
chocobo7779
post Aug 21 2015, 04:52 PM

Power is nothing without control
********
All Stars
14,673 posts

Joined: Sep 2010
QUOTE(Minecrafter @ Aug 21 2015, 04:42 PM)
Zotac GTX950 AMP! performing almost as good as a stock clock speeds GTX960 on some games,and slightly better on some games. shocking.gif
*
Correct, but the pricing on the ASUS Strix is just downright wrong... sweat.gif

But one problem though - the Zotac retails for USD175, whereas a reference 960 costs USD190. sweat.gif

This post has been edited by chocobo7779: Aug 21 2015, 04:53 PM
Moogle Stiltzkin
post Aug 21 2015, 07:22 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
Samsung Enters The HBM Market In 1H 2016 – HPC and GPU Ready HBM With Up to 1.5 TB/s Bandwidth and 48 GB VRAM
QUOTE
Starting in 2016, the first markets that Samsung will focus towards will include the HPC and graphics department. Samsung has a wide array of HBM configurations pre-planned.

Each HBM stack will be made from a single 8Gb component and range down to several tiers of HBM SKUs. The entry level models include the 2-Hi DRAM model that will be integrated on mainstream 2 GB HBM graphics cards (256 GB/s), performance based graphics with 4 GB HBM (512 GB/s). The Enthusiast graphics cards will ship with 4-Hi DRAM with 2 HBM stacks that will allow 8 GB VRAM (512 GB/s) and finally, 4 HBM Stacks with 16 GB VRAM models (1 TB/s).

On the HPC front, there are a wide array of high bandwidth and dense memory designs that include 4-Hi DRAMs with 4 HBM stacks that feature 32 GB VRAM (1 TB/s) and the bulky, 8-Hi DRAMs configured in 6 HBM stacks with 24 GB and 48 GB VRAM, both models featuring 1.5 TB/s bandwidth. There are also some network oriented HBM SKUs which are planned for launch in 2017 with 8-Hi DRAM Stacks configured in 1-2 HBM chips. In 2018, Samsung wants to focus on increasing market growth by entering new applications to incorporate their HBM designs.


http://wccftech.com/samsung-enters-hbm-mar...dth-48-gb-vram/

is it a stretch then to guess now what hbm capacity pascal will have? hmm.gif

i think pascal said 1tb/s.... so doesn't that then mean 16gb vram?? drool.gif

but will they be sourcing from samsung? hmm.gif also, is this product lineup more or less the same as other vendors', meaning that 1tb/s will definitely be 16gb vram?
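quick sanity check of the maths in that slide (my own sketch; the per-stack figures are just read off the quote above, and the pascal tie-in is pure guesswork):

CODE
stack_bw_gbs = 256      # implied by the 2-stack 512 GB/s and 4-stack 1 TB/s figures above
stack_cap_gb = 4        # one 4-Hi stack of 8Gb dies = 4 GB
for stacks in (1, 2, 4):
    print(stacks, "stack(s) ->", stacks * stack_cap_gb, "GB,",
          stacks * stack_bw_gbs / 1000, "TB/s")
# 4 stacks -> 16 GB and ~1 TB/s, so a "1 TB/s" pascal would line up with 16gb vram
# IF it uses the same 4-Hi stacks - that part is assumption, not confirmed anywhere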

This post has been edited by Moogle Stiltzkin: Aug 21 2015, 07:27 PM
Minecrafter
post Aug 21 2015, 07:27 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(chocobo7779 @ Aug 21 2015, 04:52 PM)
Correct, but the pricing on the ASUS Strix is just downright wrong... sweat.gif

But one problem though - the Zotac retails for USD175, whereas a reference 960 costs USD190.  sweat.gif
*
Y'know, it's Asus, what do you expect. tongue.gif

With USD, GBP and Euros, the price difference is very small. With Malaysian Ringgit, a bit big.
TSskylinelover
post Aug 21 2015, 09:01 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Minecrafter @ Aug 21 2015, 07:27 PM)
With USD,GBP and Euros,price difference is very small.With Malaysian Ringgit,a bit big.
*
A lot big, more like it doh.gif shakehead.gif

I'm talking about a 200-300 difference, not 20-30, dang yo
kahyeec
post Aug 21 2015, 10:16 PM

Regular
******
Senior Member
1,152 posts

Joined: Jul 2006


QUOTE(chocobo7779 @ Aug 21 2015, 04:52 PM)
Correct, but the pricing on the ASUS Strix is just downright wrong... sweat.gif

But one problem though - the Zotac retails for USD175, whereas a reference 960 costs USD190.  sweat.gif
*
Yup, price from Newegg is USD174 (170+4 shipping), which equals MYR724; plus 6% GST that is MYR767. Asus Malaysia price with a free thumb drive is MYR879, a 112 ringgit difference.

Seems the GTX950 doesn't come with a free game. Kind of expected seeing it is a budget card. biggrin.gif

Designwise the EVGA one looks quite premium. Might get one from Amazon drool.gif
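for anyone who wants to redo the sums at their own exchange rate, a quick sketch (the ~4.16 rate is simply implied by the MYR724 figure above, not an official rate):

CODE
usd_price = 174            # newegg price incl. shipping
myr_per_usd = 724 / 174    # ~4.16, the rate implied by the MYR724 figure
gst = 0.06
landed = usd_price * myr_per_usd * (1 + gst)
print(round(landed), "vs local", 879, "=", 879 - round(landed), "ringgit difference")
# prints 767 vs local 879 = 112 ringgit difference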

This post has been edited by kahyeec: Aug 21 2015, 10:18 PM
SSJBen
post Aug 21 2015, 11:58 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(kahyeec @ Aug 21 2015, 10:16 PM)
Yup, price from Newegg is USD174 (170+4 shipping) , which will equal MYR724, plus 6% GST is MYR767 . Asus Malaysia price with free thumb drive is MYR879 , 112 ringgit difference.

Seems the GTX950 dont have free game . Kind of expected seeing it is a budget card.  biggrin.gif

Designwise the EVGA one looks quite premium . Might get  one from Amazon  drool.gif
*
You all understand that prices on US webstores do not include state tax and shipping charges, right? Different states, different tax percentages. USD174 in NY is actually $196 with 13% tax.
kahyeec
post Aug 22 2015, 07:16 AM

Regular
******
Senior Member
1,152 posts

Joined: Jul 2006


QUOTE(SSJBen @ Aug 21 2015, 11:58 PM)
You all understand that prices on US webstores does not include state tax and shipping charges right? Different states, different tax percentage. USD174 in NY is actually $196, 13% tax.
*
That is true if we were to buy from the US, then we would need to add around MYR100 for shipping, but we have an Asus dealer here in Malaysia, and their base pricing (excluding shipping) on the same product is already higher than the US market.
But i guess RRP or SRP is just for reference only; shops always price near RRP at launch, and prices will drop after a few months. I already saw the Strix 980TI selling for 3199 here in Kuching liao, and the suggested price at launch for it was 3399?
SUSTheHitman47
post Aug 22 2015, 12:37 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(kahyeec @ Aug 21 2015, 10:16 PM)
Yup, price from Newegg is USD174 (170+4 shipping) , which will equal MYR724, plus 6% GST is MYR767 . Asus Malaysia price with free thumb drive is MYR879 , 112 ringgit difference.

Seems the GTX950 dont have free game . Kind of expected seeing it is a budget card.  biggrin.gif

Designwise the EVGA one looks quite premium . Might get  one from Amazon  drool.gif
*
could wait for idealtech. they seem to bring in evga cards now.
TSskylinelover
post Aug 22 2015, 12:40 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(TheHitman47 @ Aug 22 2015, 12:37 PM)
could wait from idealtech. they seems to bring evga cards now.
*
And mark up 50% without gst yet laugh.gif doh.gif
SUSTheHitman47
post Aug 22 2015, 12:42 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(skylinelover @ Aug 22 2015, 12:40 PM)
And mark up 50% without gst yet laugh.gif doh.gif
*
lol, true tho. but compared to the cost of posting to msia, is it cheaper?
TSskylinelover
post Aug 22 2015, 12:46 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(TheHitman47 @ Aug 22 2015, 12:42 PM)
lol, true tho. but compare to the cost of posting to msia?. is it cheaper?.
*
More expensive actually haha dang

Shipping direct from amazon is cheaper, most said
SUSTheHitman47
post Aug 22 2015, 12:54 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(skylinelover @ Aug 22 2015, 12:46 PM)
More ex actually haha dang

Shipping direct from amazon more cheaper most said
*
whichever cheaper is better. thumbup.gif
eatsleepnDIE
post Aug 22 2015, 01:23 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


Support local biz la tongue.gif
goldfries
post Aug 22 2015, 01:46 PM

40K Club
Group Icon
Forum Admin
44,415 posts

Joined: Jan 2003




Galax GTX 950 is at RM 859 but it's really beautiful.
kahyeec
post Aug 22 2015, 02:42 PM

Regular
******
Senior Member
1,152 posts

Joined: Jul 2006


QUOTE(TheHitman47 @ Aug 22 2015, 12:42 PM)
lol, true tho. but compare to the cost of posting to msia?. is it cheaper?.
*
Amazon Export Inc charges USD26 to ship to Malaysia for the EVGA 980TI. I bought one and it is just that: no gst, no import tax, no excise tax, so I got my 980TI SC ACX2.0 with backplate for 2700 all in. wink.gif Amazon uses ARAMEX to send.
kmarc
post Aug 22 2015, 06:34 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(kahyeec @ Aug 22 2015, 02:42 PM)
Amazon Export Inc charges usd 26 to ship to Malaysia for EVGA 980TI , I bought one and it is just that, no gst, no import tax, no excise tax , so I got my 980TI SC ACX2.0 backplate for 2700 all in. wink.gif . Amazon use ARAMEX. To send.
*
How about warranty claim?
kahyeec
post Aug 22 2015, 08:11 PM

Regular
******
Senior Member
1,152 posts

Joined: Jul 2006


QUOTE(kmarc @ Aug 22 2015, 06:34 PM)
How about warranty claim?
*
As it turns out, the 980ti that I ordered from Amazon was faulty on arrival. This is what I did, and I assume what you all will need to do in the event of an EVGA RMA:
It's not too hard, and it is a good way to test their famous RMA policies:
1. Get a screenshot of your Amazon invoice.
2. Register the card on their website. They will ask for that invoice and the serial number of the card.
3. Open a Technical Assistance Ticket; in the form you will need to fill in the FUN stuff like showing off the spec of your high end rig and of course why you think your card is faulty.
Plenty of slots for photos, so you can show off your spec AND send them photos.
4. They will reply and ask you to do a few counter checks such as update the BIOS, stick the card in another computer, slot the card in another slot, bla bla bla.
5. You reply that you have done all those things and still the card is a no go.
6. Then you open an RMA ticket; they will approve your ticket if they think it is a faulty card.
7. Once you open a Technical Assist Ticket, everything is done via email, and if they approve your RMA ticket you will get an email saying that it has been approved.
8. Send them the item bare naked, wrapping it in bubble wrap and packing peanuts in a box; they will tell you to send ONLY the card, and you keep the packaging and posters and stickers and whatnot.
They will send you back a bare card only as well, but in better, classier packing. The address and whatnot will be sent to you in the approval email.
9. You will pay for the postage to EVGA Taiwan. You can choose expensive ones like FedEx or DHL, or cheaper ones like Poslaju EMS. They will pay for the postage when they send you the replacement.
10. They use TNT as their favourite courier in Malaysia.
11. In my case I argued that it was a brand new card and a DOA case, and requested them to pay both ways. They agreed. I dropped the card off at the TNT office the Thursday before Hari Raya.
12. The replacement was sent out on the Monday after Hari Raya and I received my brand new replacement card on Wednesday.
13. Register your new card after a few days of playing with it to start the warranty.

Total length of the RMA was around 20 days, taking into account slight delays because of the 4th of July celebrations in the US, the typhoon in Taiwan and Hari Raya.
All in all I am satisfied with their RMA process.
So don't worry too much about RMA for EVGA cards.


kmarc
post Aug 22 2015, 08:46 PM

The future is here - Cryptocurrencies!
Group Icon
Elite
14,576 posts

Joined: May 2006
From: Sarawak



QUOTE(kahyeec @ Aug 22 2015, 08:11 PM)
As it turns out , the 980ti that I ordered from Amazon is Faulty on arrival. This is what I did and i assume what you all will need to do in the Event of a EVGA RMA :
Its not too hard , and it is a good way to test their famous RMA policies:
1. Get a screen shot of your Amazon invoice.
2. Register the card on their website . They will ask for that invoice and the serial number of the card.
3. Open a Technical Assistance Ticket , in the form you will need to fill in the FUN stuff like showing off the spec of your high end rig and of course why you think your card is faulty.
    Plenty of slots for photos , so you can show off your spec AND send them photos.
4. They will reply and ask you to do a few counter checks such as update bios , stick the card on another computer , slot the card in another slot bla bla bla.
5. you will reply them that you have done all those things and still the card is no go.
6. then you open a RMA ticket , they will approve your ticket if they think it is a faulty card.
7. Once you open a Technical Assist Ticket , everything is done via email , and if they approve your RMA Ticket , you will get a email saying that your RMA tiicket has been approved.
8. Send them your item bare naked , wraping it in bubble wrap and packing peanut in a box, they will tell you to send ONLY the card, you keep the packaging and posters and stickers and whatnot.
    they will be sending you back a bare card only also. but in a better , more classier packing. The address and whatnot will be sent to you in the approval email.
9. You will pay for the postage to EVGA Taiwan.You can choose expensive one like FedEx , DHL or cheaper ones like Poslaju EMS.  They will pay for the postage when they send you the replacement.
10.  They use TNT as their favorite courier in Malaysia.
11. My case i argued that it was a brand new card and it is a DOA case, requested them to pay both ways. They agreed . I drop the card off TNT office the Thursday before Hari Raya.
12. The replacement was sent out on the Monday after Hari Raya and I receive my brand new replacement card on Wednesday.
13. Register your new card after a few days of playing with it to start the warranty.

Total length of RMA around 20 days , taking into account there is a slight delay because of the 4th of July celebrations in the US, the typoon in Taiwan and Hari Raya.
All in all I am satisfied with their RMA process.
So dont worry too much about RMA  for EVGA cards.
*
Oh, it's sent to Taiwan and not US? At least the postage won't be that expensive, esp RMA later on when you have to pay yourself.
TSskylinelover
post Aug 22 2015, 08:48 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
Haha nice sharing that laugh.gif rclxms.gif
shikimori
post Aug 22 2015, 10:43 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


In my case I RMA'd the 980ti ref card and asked for a strix, which they complied with - ok or not lol? But had to top up rm100

Should have asked to replace for free T_T

So far, i'm pleased with the 0db noise lol, but kinda underwhelmed with the oc ability, can't push it more than 1480 gpu boost

This post has been edited by shikimori: Aug 22 2015, 10:46 PM


Attached thumbnail(s)
Attached Image
llk
post Aug 22 2015, 10:46 PM

Look at all my stars!!
*******
Senior Member
4,157 posts

Joined: Jan 2003
From: KL
QUOTE(shikimori @ Aug 22 2015, 10:43 PM)
My case rma 980ti ref card and ask for strix which they comply ok or not lol?  But top up rm100

Should have asked to replace for free T_T
*
Bro what is going on to your ref card?
I will put a waterblock by next week for my ref card.
shikimori
post Aug 22 2015, 10:48 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(llk @ Aug 22 2015, 10:46 PM)
Bro what is going on to your ref card?
I will put a waterblock by next week for my ref card.
*
My rig suddenly auto shut down with a burning smell. When I tried to turn it on again there was no power, but with the card removed it powered up.

Had to wait 2 weeks for the replacement

This post has been edited by shikimori: Aug 22 2015, 10:49 PM
Unseen83
post Aug 22 2015, 11:01 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(shikimori @ Aug 22 2015, 10:48 PM)
My rig suddenly auto shutdown with burning smell . When try to turn on again no power but if removed the card got power

Had to wait for 2 weeks till replacement
*
unsure.gif gosh... gtx 980 Ti Ref.. die oledy.. sad.gif xx well my R9 fury X send his condolence... guess tho super high clock oc got it limit.. eh ? but bright side you got brand new GTX 980 Ti 3 cooler rclxms.gif
shikimori
post Aug 22 2015, 11:22 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(Unseen83 @ Aug 22 2015, 11:01 PM)
unsure.gif  gosh...  gtx 980 Ti Ref.. die oledy.. sad.gif xx well my R9 fury X send his condolence...  guess tho super high clock oc got it limit.. eh ?  but bright side you got brand new GTX 980 Ti 3 cooler  rclxms.gif
*
biggrin.gif thank you kind fury x owner. I hope my former 980 ti ref will rest in pieces peace.

The new card barely makes any sound, finally!!!! thumbup.gif no more hair dryer

ASIC is around 71, is this okay?
Unseen83
post Aug 22 2015, 11:30 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(shikimori @ Aug 22 2015, 11:22 PM)
biggrin.gif thank you kind fury x owner . I hope my former 980 ti ref will rest in pieces peace  .

The new card barely make any sound finally !!!! thumbup.gif  no more hair dryer

Asic is around 71 is this okay ?
*
eh haha icon_rolleyes.gif no need la to oc to 1.4GHz, Fury x is not chasing you... my fury x can't go higher than 1.2GHz.. game or bench crashes.. lolx but temp is under 60C.. doh.gif it's good to game 6-8 hours and temp is still under 60c smile.gif x

ppl on youtube say a higher ASIC is good for OC



add-on: GPUs with HBM do not support ASIC readings... unsure.gif

This post has been edited by Unseen83: Aug 22 2015, 11:31 PM
SUScrash123
post Aug 22 2015, 11:33 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(shikimori @ Aug 22 2015, 11:22 PM)
biggrin.gif thank you kind fury x owner . I hope my former 980 ti ref will rest in pieces peace  .

The new card barely make any sound finally !!!! thumbup.gif  no more hair dryer

Asic is around 71 is this okay ?
*
I hear the ASUS 980ti strix has no new stock coz ASUS got a problem with their cooler. The DCU III has 5 heat pipes but only 3 make contact with the chip. And the cards available around the world are old stock. Don't know if it's true or not. Maybe if they refresh the cooler u can make a complaint and get a new card laugh.gif

user posted image


shikimori
post Aug 22 2015, 11:59 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(crash123 @ Aug 22 2015, 11:33 PM)
I hear ASUS 980ti strix got no new stock coz ASUS got problem with their cooler. DCU III got 5 heat pipe but only 3 are made contact to the chip. And the card available around the world is old stock. Dont know if its true or not. Maybe if they refresh the cooler u can make a complain and u can get a new card  laugh.gif

user posted image
*
LOL I think I've had enough waiting for now. 2 weeks for me is like 2 months.. my backup 290 isn't that strong

Yeah, I heard people complaining that there is no cooling for the vram.
arifhasim85
post Aug 23 2015, 12:22 AM

Casual
***
Junior Member
351 posts

Joined: Sep 2010
From: alor setar, kedah


QUOTE(crash123 @ Aug 22 2015, 11:33 PM)
I hear ASUS 980ti strix got no new stock coz ASUS got problem with their cooler. DCU III got 5 heat pipe but only 3 are made contact to the chip. And the card available around the world is old stock. Dont know if its true or not. Maybe if they refresh the cooler u can make a complain and u can get a new card  laugh.gif

user posted image
*
it's the same problem as with the previous DCU II.. haiz, what's wrong with ASUS, never learning from their past mistakes..
JohnLai
post Aug 23 2015, 12:32 AM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(arifhasim85 @ Aug 23 2015, 12:22 AM)
its the same problem with previous DCU II.. haiz whats wrong with ASUS never learn from their past mistake..
*
-.- Probably because ASUS prefers to use Samsung GDDR5 chips, which are quite good at overclocking with less heat production.

Overclocking from 7000MHz (quad pumped) to 7800 and 8000MHz doesn't result in much of a temperature increase at all.
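rough maths on what "quad pumped" means and why those numbers matter (my own sketch, assuming the 980's reference 256-bit bus):

CODE
command_clock_mhz = 1750                 # actual GDDR5 command clock
effective_mts = command_clock_mhz * 4    # quad pumped -> the "7000 MHz" marketing figure
bus_width_bits = 256                     # assumed: GTX 980 reference memory bus
bandwidth_gbs = effective_mts * bus_width_bits / 8 / 1000
print(bandwidth_gbs)                     # 224.0 GB/s stock; "7800" -> ~250, "8000" -> ~256
# tools that report 3900 MHz are just showing half of the 7800 effective rate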


SUScrash123
post Aug 23 2015, 12:44 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(arifhasim85 @ Aug 23 2015, 12:22 AM)
its the same problem with previous DCU II.. haiz whats wrong with ASUS never learn from their past mistake..
*
But Asus came back with this statement. What BS. Why not just connect the chip with all 5 heat pipes, or just make 3 fking big heat pipes, or just do like Gigabyte with one copper plate. Damn u ASUS

user posted image
marfccy
post Aug 23 2015, 01:01 AM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(crash123 @ Aug 23 2015, 12:44 AM)
But Asus make a comeback with this statement. What a BS. Why dont just connect the chip with 5 heat sink or just make 3 fking big heatsink or just do like Gigabyte just one cooper plate. Damn u ASUS

user posted image
*
it's the same for the EVGA ACX cooler, the heatpipes don't contact the chip fully
SUScrash123
post Aug 23 2015, 01:15 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(marfccy @ Aug 23 2015, 01:01 AM)
its the same for EVGA ACX cooler, the heatpipes dont contact with the chip fully
*
are u sure?? hmm.gif
ACX
user posted image
ACX 2.0
user posted image

This post has been edited by crash123: Aug 23 2015, 01:15 AM
marfccy
post Aug 23 2015, 01:23 AM

Le Ponyland!!!
*******
Senior Member
4,254 posts

Joined: Nov 2011


QUOTE(crash123 @ Aug 23 2015, 01:15 AM)
are u sure?? hmm.gif
ACX
user posted image
ACX 2.0
user posted image
*
my bad, maybe not ACX 2.0. this one is from GTX970 ACX
» Click to show Spoiler - click again to hide... «

SSJBen
post Aug 23 2015, 02:55 AM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(crash123 @ Aug 23 2015, 01:15 AM)
are u sure?? hmm.gif
ACX
user posted image
ACX 2.0
user posted image
*
Uhm, 2 pipes are still not in direct contact, regardless of the nickel copper plate.
TSskylinelover
post Aug 23 2015, 02:40 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(shikimori @ Aug 22 2015, 10:48 PM)
My rig suddenly auto shutdown with burning smell . When try to turn on again no power but if removed the card got power

Had to wait for 2 weeks till replacement
*
QUOTE(shikimori @ Aug 22 2015, 11:22 PM)
biggrin.gif thank you kind fury x owner . I hope my former 980 ti ref will rest in pieces peace  .

The new card barely make any sound finally !!!! thumbup.gif  no more hair dryer

Asic is around 71 is this okay ?
*
How did you kill your masterpiece ohmy.gif ohmy.gif ohmy.gif
shikimori
post Aug 23 2015, 03:01 PM

Getting Started
**
Junior Member
236 posts

Joined: Jul 2007
From: Penang


QUOTE(skylinelover @ Aug 23 2015, 02:40 PM)
How did you kill your masterpiece ohmy.gif ohmy.gif ohmy.gif
*
By being stupid.... Idk, it just happened
SUScrash123
post Aug 23 2015, 10:10 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
Anyone know where I can find an i7-6700k? IdealTech says the stock will arrive in 2 weeks, but i see some people already selling their i5-6600k
heerosakura
post Aug 24 2015, 09:08 AM

Getting Started
**
Junior Member
175 posts

Joined: Aug 2012
QUOTE(crash123 @ Aug 23 2015, 10:10 PM)
Anyone know where I can find i7-6700k, IdealTech say the stock will arrived in 2 weeks but i see some people already sell their i5-6600k
*
lol, here we discuss graphics cards, especially Nvidia
but you come here asking about processors tongue.gif
Moogle Stiltzkin
post Aug 24 2015, 11:01 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(crash123 @ Aug 23 2015, 10:10 PM)
Anyone know where I can find i7-6700k, IdealTech say the stock will arrived in 2 weeks but i see some people already sell their i5-6600k
*
doh.gif

SUScrash123
post Aug 24 2015, 12:41 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(Moogle Stiltzkin @ Aug 24 2015, 11:01 AM)
doh.gif
*
Haha. Can't start completing my rig without a cpu
Minecrafter
post Aug 24 2015, 04:12 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(crash123 @ Aug 24 2015, 12:41 PM)
Haha. Cant start complete my rig without cpu
*
But this is the NVidia thread, not Intel.. no reply needed. wink.gif
SUScrash123
post Aug 24 2015, 04:14 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(Minecrafter @ Aug 24 2015, 04:12 PM)
But this is NVidia thread,not Intel..no reply needed. wink.gif
*
Ok.jpeg sad.gif
Moogle Stiltzkin
post Aug 24 2015, 10:25 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(crash123 @ Aug 24 2015, 04:14 PM)
Ok.jpeg sad.gif
*
is okay... we all noob at one point smile.gif
antaras
post Aug 25 2015, 02:07 PM

Getting Started
**
Junior Member
196 posts

Joined: Jan 2010
From: Kuala Lumpur


http://www.hardwarecanucks.com/forum/hardw...0x-rematch.html

Good information. Not quite sure where all those comments about Nvidia purposely gimping Kepler's performance are coming from.
SUScrash123
post Aug 25 2015, 11:21 PM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(antaras @ Aug 25 2015, 02:07 PM)
http://www.hardwarecanucks.com/forum/hardw...0x-rematch.html

Good information. Not quite sure where're all those comments about Nvidia purposely gimp their Kepler's performance coming from.
*
Wew. GTX 780ti performance almost on par with GTX 980

This post has been edited by crash123: Aug 25 2015, 11:21 PM
JohnLai
post Aug 25 2015, 11:46 PM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(antaras @ Aug 25 2015, 02:07 PM)
http://www.hardwarecanucks.com/forum/hardw...0x-rematch.html

Good information. Not quite sure where're all those comments about Nvidia purposely gimp their Kepler's performance coming from.
*
Probably from certain game features such as Project Cars (physics) and Witcher 3 (HairWorks).
alfiejr
post Aug 26 2015, 12:14 AM

Gaming~
******
Senior Member
1,294 posts

Joined: Feb 2012
From: Taman Rasa Sayang, Cheras


QUOTE(crash123 @ Aug 25 2015, 11:21 PM)
Wew. GTX 780ti performance almost on par with GTX 980
*
Not on par. The 980 is definitely faster with the newer games, as previous posters stated, in games like witcher 3, DAI, GTA V. I know because i can turn up some settings with the 980 where i couldn't with my previous 780ti. The 980 is also rock stable and i haven't encountered any crashes as of yet.

Though with slightly older titles like shadow of mordor, i can see it being a bit closer to the 780ti. But newer games seem to like the maxwell cards more than the kepler ones yawn.gif
SUScrash123
post Aug 26 2015, 12:26 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(alfiejr @ Aug 26 2015, 12:14 AM)
Not on par. The 980 is definitely faster with the newer games. Likewise previous posters stated in games like witcher 3 , DAI , GTA V. I know because i can turn up some settings with the 980 where i couldnt do it with my previous 780ti. The 980 is also rock stable and i hadnt encounter any crashes as of yet.

Though with slightly older titles like shadow of mordor, i can see it a bit closer to the 780ti. But newer games seems to like the maxwell cards more so than the kepler yawn.gif
*
But i think Shadow Of Mordor is really well optimized - probably the most optimized game of the year. AMD and NVIDIA performance is almost equal, and I can run this game on ultra+MFAA+4k and get 35-40fps, never dipping below 30
Rei7
post Aug 26 2015, 04:13 PM

Game, anime and headphones ❤️
******
Senior Member
1,669 posts

Joined: Apr 2011



3 months back I bought a Galax GTX970 HOF.. and decided to get another one 2 days ago.
Too lazy to sell it back and get a 980ti, plus I'd need to fork out more money.

Anyone have a good video or something on how DX12 benchmarks look for SLI 970 vs a 980TI?

This post has been edited by Rei7: Aug 26 2015, 04:26 PM
Nickchong1314
post Aug 27 2015, 12:09 PM

New Member
*
Junior Member
9 posts

Joined: Apr 2015


Anyone signed up for The NVIDIA Play The Future Event on September 4, 2015 @ The Coffee Club, Subang Jaya SS16?

NVIDIA is going to giveaway some awesome prizes like NVIDIA GeForce GTX 950 Graphic Card, Acer XB240H Monitor and Plextor M6V 128GB. rclxms.gif

user posted image

Register here

News source: Tech-critter
eatsleepnDIE
post Aug 27 2015, 12:10 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


QUOTE(Nickchong1314 @ Aug 27 2015, 12:09 PM)
Anyone has signed up for The NVIDIA Play The Future Event On September 4, 2015 @ The Coffee Club, Subang Jaya SS16?

NVIDIA is going to giveaway some awesome prizes like NVIDIA GeForce GTX 950 Graphic Card, Acer XB240H Monitor and Plextor M6V 128GB. rclxms.gif

user posted image

Register here

News source:  Tech-critter
*
yep, going be there lol
Moogle Stiltzkin
post Aug 28 2015, 11:29 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(eatsleepnDIE @ Aug 27 2015, 12:10 PM)
yep, going be there lol
*
me2 smile.gif i'm calling dibs on the monitor flex.gif

ben you going ?

This post has been edited by Moogle Stiltzkin: Aug 28 2015, 12:24 PM
cstkl1
post Aug 28 2015, 12:16 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Rei7 @ Aug 26 2015, 04:13 PM)
3 month back bought Galax GTX970 HOF.. and decided to get another one 2 days ago.
So lazy to sell back and get 980ti, plus need to fork more money.

Anyone has a good video or something on how benchmarking for DX12 is, for SLI 970 vs 980TI?
*
Min fps.

That's where the smoothness differs.

Same as ppl comparing 1600MHz ram vs 2400MHz, or 8gb vs 16gb.

The Ti with its 6gb vram paired with 2400MHz 16gb ram..
Min fps is where it shines vs a 970 sli setup.

I tested my tb sli overclocked 1300/8000 vs a single tx. It's the smoothness advantage of the tx. In benchy numbers tb sli wipes the floor with a single tx.

This post has been edited by cstkl1: Aug 28 2015, 12:17 PM
Moogle Stiltzkin
post Aug 28 2015, 12:22 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
Ashes of the Singularity Benchmark
http://www.ign.com/articles/2015/08/17/fir...nchmark-results
http://wccftech.com/ashes-singularity-alpha-dx12-benchmark/

nvidia says this game is not matured enough to be considered to be a proper benchmark for dx12... so.... just take the results for now with a grain of salt.

This post has been edited by Moogle Stiltzkin: Aug 28 2015, 12:23 PM
Rei7
post Aug 28 2015, 12:32 PM

Game, anime and headphones ❤️
******
Senior Member
1,669 posts

Joined: Apr 2011



QUOTE(cstkl1 @ Aug 28 2015, 12:16 PM)
Min fps.

That where the smoothness differ

Same as ppl telling 1600ram vs 2400mhz or 8gb vs 16gb

Ti with the 6gb vram paired with 2400 16gb ram..
Min fps is where it shines vs a 970 sli setup

I tested my tb overclocked 1300/8000 vs single tx sli.Its the smoothness advantage of tx. Benchy numbers tb sli wipes the floor with a single tx.
*
As expected nod.gif
Just wondering what DX12 can do with SLI.
There don't seem to be many videos and such on that yet.
Moogle Stiltzkin
post Aug 28 2015, 12:35 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Rei7 @ Aug 28 2015, 12:32 PM)
As expected  nod.gif
Just wondering what that DX12 can do with SLI.
Don't seems to have much of videos and such on those yet.
*
from what i heard, dx12 makes multi gpus much more compatible with each other.

DirectX 12 Will Allegedly Allow Multi-GPU use Between Nvidia and AMD Cards
QUOTE
According to the source, the API will be able to combine different graphics resources and pool all those resources together. Rather than having multiple GPUs rendering an Alternate Frame (AFR) each, there is a new method called Split Frame Rendering (SFR) that is being introduced. With this feature, developers will be able to automatically, or manually, divide texture and geometry data between GPUs that will be able to work together on each frame and be designated a separate portion of the screen for each GPU.

Unlike AFR, that requires both cards to have all of the data in their frame buffers and leaving the user to have a 4GB frame buffer even though there are two cards with 4GB of memory each.

This will, says the source, significantly reduce latency.

Yet SFR isn’t new, as AMD’s Mantle API supports it and there are applications out there that do as well (see how Mantle performs against DirectX 11). What is surprising is that the source went on to say that DirectX 12 will support all of this across different GPU architectures, allowing AMD Radeon and Nvidia GeForce GPUs to work together to render the same game.

However, while this sounds great, it will still be up to developers to make use of, and utilize, Explicit Asynchronous Multi-GPU Capabilities for their games and software.

http://www.maximumpc.com/directx-12-will-a...-and-amd-2015/#!


if i'm not mistaken, this would also mean 2 gpus with 4gb vram each would effectively give 4+4 = 8gb vram, rather than 4gb vram as prior to dx12 hmm.gif
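a toy illustration of the AFR vs SFR memory point from the article (just my sketch of the idea, not how any real driver handles it):

CODE
cards_gb = [4, 4]              # two hypothetical 4 GB cards

afr_usable = min(cards_gb)     # AFR: each card mirrors all frame data -> 4 GB usable
sfr_usable = sum(cards_gb)     # SFR / explicit multi-adapter best case -> 8 GB usable

print(afr_usable, sfr_usable)  # 4 vs 8 - the best case still depends on the developer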


This post has been edited by Moogle Stiltzkin: Aug 28 2015, 12:37 PM
Minecrafter
post Aug 28 2015, 12:46 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(Moogle Stiltzkin @ Aug 28 2015, 12:35 PM)
from what i heard dx12 makes multi gpus much more compatible.

DirectX 12 Will Allegedly Allow Multi-GPU use Between Nvidia and AMD Cards

http://www.maximumpc.com/directx-12-will-a...-and-amd-2015/#!
if i'm not mistaken this also would mean 2 gpus with 4gb vram each would effectively be 4+4 = 8vram, rather than 4gb vram prior dx12 hmm.gif
*
I hope that will happen. thumbup.gif
Moogle Stiltzkin
post Aug 28 2015, 12:53 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Minecrafter @ Aug 28 2015, 12:46 PM)
I hope that will happen. thumbup.gif
*
yeah multi gpu more bang for your buck. but i still wouldn't go multi gpu :} expensive. not to mention i don't think this will change multi gpus being more dependent on driver updates compared to single gpu solutions.
cstkl1
post Aug 28 2015, 12:54 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Moogle Stiltzkin @ Aug 28 2015, 12:35 PM)
from what i heard dx12 makes multi gpus much more compatible.

DirectX 12 Will Allegedly Allow Multi-GPU use Between Nvidia and AMD Cards

http://www.maximumpc.com/directx-12-will-a...-and-amd-2015/#!
if i'm not mistaken this also would mean 2 gpus with 4gb vram each would effectively be 4+4 = 8vram, rather than 4gb vram prior dx12  hmm.gif
*
Think it won't work with nvidia.
Nvidia tends to do a lot of firmware and architecture things.
Hence why there was lower cpu overhead from the driver before.

So that's why u see unified mem/virtual mem etc on their roadmap.

The dx12 feature set etc was done really quickly and i am pretty sure it threw some wrenches into nvidia's roadmap plans, since nvidia tends to do r&d two gens forward

This post has been edited by cstkl1: Aug 28 2015, 12:56 PM
Moogle Stiltzkin
post Aug 28 2015, 01:02 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Aug 28 2015, 12:54 PM)
Think wont work with nvidia
Nvidia tends to do alot of firmware n architecture thingy.
Hence y lower overhead on cpu before from driver.

So thats y why u see on their roadmap theres unified mem/virtual mem etc.

Dx12 feature set etc was done really quickly n i am pretty sure it threw some wrenches into nvidia roadmap plans. Since nvidia tends to do two gen forward r&d
*
oh that sucks :{

so.... if it won't work with amd gpus... then what about other nvidia gpus? hmm.gif

when i attend this event, i'll see if they mention anything about that hmm.gif
cstkl1
post Aug 28 2015, 01:04 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Moogle Stiltzkin @ Aug 28 2015, 01:02 PM)
oh that sucks :{

so.... if wont work with amd gpus... then what about other nvidia gpus ?  hmm.gif

when i attend this event, will see if they mention on that  hmm.gif
*
That event will still be about gsync, dx12, etc, same as the 980ti event.


Moogle Stiltzkin
post Aug 28 2015, 01:05 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(cstkl1 @ Aug 28 2015, 01:04 PM)
That event. Will still be about gsync, dx12, etc same as 980ti event.
*
are you referring to the one in singapore? or was there already one in MY :/ ? i must have missed that one then sad.gif
cstkl1
post Aug 28 2015, 01:06 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Moogle Stiltzkin @ Aug 28 2015, 01:05 PM)
are you refering to the one in singapore ? or was there alrdy one in my :/ ? i must have missed that one then  sad.gif
*
Both.
Nvidia is doing more campaigning since they are flush with cash. Not killing off amd, but more like ensuring amd won't bounce back.
Minecrafter
post Aug 28 2015, 02:46 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(Moogle Stiltzkin @ Aug 28 2015, 12:53 PM)
yeah multi gpu more bang for your buck. but i still wouldn't go multi gpu :} expensive. not to mention i don't think this will change multi gpus being more dependent on driver updates compared to single gpu solutions.
*
Yeah, the optimization problems will still be there. yawn.gif But still, i don't think it's worth it to SLI or CFX mid-range or low-range cards. tongue.gif
Rei7
post Aug 28 2015, 03:34 PM

Game, anime and headphones ❤️
******
Senior Member
1,669 posts

Joined: Apr 2011



QUOTE(Moogle Stiltzkin @ Aug 28 2015, 12:35 PM)
if i'm not mistaken this also would mean 2 gpus with 4gb vram each would effectively be 4+4 = 8vram, rather than 4gb vram prior dx12  hmm.gif
*
Yeah full 8gb with DX12, so i've heard. No more 50% only for the 2nd card and stuff.
Exciting times, but still depends on developers to fully utilize DX12.


SSJBen
post Aug 28 2015, 03:35 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Dell just announced their first G-sync monitor: S2716DG.
1440p, 144hz, G-sync and ULMB.

At first I was like awhhh yissss!! Then, I found out it's a TN panel. Fudge. doh.gif

/waitingcontinues


*EDIT*
Yes, I know the TN panels on these sets will be among the highest-end panels. And yes, I know IPS has its own set of issues too; backlight bleed and IPS glow are notorious on them. But damn, having used IPS for several years now, it is difficult to go back down to a different color gamut. It's just weird not being able to tell gradients apart, something which even the best TN panels cannot cope with.

This post has been edited by SSJBen: Aug 28 2015, 03:38 PM
TSskylinelover
post Aug 29 2015, 02:42 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Minecrafter @ Aug 28 2015, 02:46 PM)
Yeah,the optimization problems will still be there.  yawn.gif  Bu still,i don't think it's worth it to SLI or CFX mid-range or low-range cards. tongue.gif
*
Haha well said laugh.gif rclxms.gif

QUOTE(Rei7 @ Aug 28 2015, 03:34 PM)
Yeah full 8gb with DX12, so i've heard. No more 50% only for the 2nd card and stuff.
Exciting times, but still depends on developers to fully utilize DX12.
*
DX12 is still at baby stage, so i ain't jumping in too soon yet. Hoping crysis 4 will be the first ever DX12 game, following the previous sequence of farcry 1 as the first dx9 game, crysis 1 the first dx10 game and finally crysis 2 the first dx11 game. laugh.gif rclxms.gif
TSskylinelover
post Aug 29 2015, 02:45 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(cstkl1 @ Aug 28 2015, 01:06 PM)
Both.
Nvidia is doing more campaigning since they are flushed with cash. Not killing off amd but more like ensuring amd wont bounce back.
*
Haha this i like

QUOTE(SSJBen @ Aug 28 2015, 03:35 PM)
Dell just announced their first G-sync monitor: S2716DG.
1440p, 144hz, G-sync and ULMB.

At first I was like awhhh yissss!! Then, I found out it's a TN panel. Fudge. doh.gif

/waitingcontinues
*EDIT*
Yes, I know the TN panels on these sets will be of the highest of end panels. And yes I know IPS has its own set of issues too, backlight bleed and IPS glow are notorious on them. But damn, having used IPS for several years now, it is difficult to go back down to a different color gamut. It's just weird not being able to tell gradients apart, something which even the best TN panels cannot cope with.
*
Same lo. Never looked back at TN anymore after owning IPS for more than a year already. Haha.
SUSngkhanmein
post Sep 1 2015, 12:00 AM

カラカラ Karakara
*******
Senior Member
7,727 posts

Joined: Jan 2010
From: Ara Damansara, Petaling Jaya & Batu Pahat, Johor.


http://www.guru3d.com/news-story/nvidia-wa...n-settings.html

"Currently is seems that Nvidias Maxwell architecture (Series 900 cards) does not really support Asynchronous compute in DX12 at a proper hardware level. Meanwhile AMD is obviously jumping onto this a being HUGE and they quickly prepared a PDF slide presentation with their take on the importance of all this. Normally I'd share add the slides into a news item, but this is 41 page of content slides, hence I made it available as separate download.

In short, here's the thing, everybody expected NVIDIA Maxwell architecture to have full DX12 support, as it now turns out, that is not the case. AMD offers support on their Fury and Hawaii/Grenada/Tonga (GCN 1.2) architecture for DX12 asynchronous compute shaders. The rather startling news is that Nvidia's Maxwell architecture, and yeah that would be the entire 900 range does not support it. I can think of numerous scenarios as to where asynchronous shaders would help."
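for anyone wondering what async compute actually buys, here's a toy timing model (my own illustration only, made-up numbers, nothing to do with how either vendor really schedules work):

CODE
graphics_ms = 12.0   # main graphics queue work for one frame (made-up number)
compute_ms = 4.0     # compute work: lighting, post-processing, etc. (made-up number)

serial_ms = graphics_ms + compute_ms       # no async compute: queues run back to back
overlap_ms = max(graphics_ms, compute_ms)  # ideal async compute: compute fills idle shader units

print(serial_ms, "ms vs", overlap_ms, "ms in the best case")
# real gains land somewhere in between, depending on how much the hardware can
# genuinely overlap - which is exactly what the Maxwell argument is about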
Unseen83
post Sep 1 2015, 04:35 AM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(ngkhanmein @ Sep 1 2015, 12:00 AM)
http://www.guru3d.com/news-story/nvidia-wa...n-settings.html

"Currently is seems that Nvidias Maxwell architecture (Series 900 cards) does not really support Asynchronous compute in DX12 at a proper hardware level. Meanwhile AMD is obviously jumping onto this a being HUGE and they quickly prepared a PDF slide presentation with their take on the importance of all this. Normally I'd share add the slides into a news item, but this is 41 page of content slides, hence I made it available as separate download.

In short, here's the thing, everybody expected NVIDIA Maxwell architecture to have full DX12 support, as it now turns out, that is not the case. AMD offers support on their Fury and Hawaii/Grenada/Tonga (GCN 1.2) architecture for DX12 asynchronous compute shaders. The rather startling news is that Nvidia's Maxwell architecture, and yeah that would be the entire 900 range does not support it. I can think of numerous scenarios as to where asynchronous shaders would help."
*
but but Nvidia says...

http://blogs.nvidia.com/blog/2015/01/21/wi...10-nvidia-dx12/

" We’re more than ready. GPUs built on our Maxwell GPU architecture – such as our recently released GeForce GTX 970 and GeForce GTX 980 – fully support DX12. "

This post has been edited by Unseen83: Sep 1 2015, 05:06 AM
JohnLai
post Sep 1 2015, 11:02 AM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
Hmm.......well......Oxide had better ensure its claim is true.....otherwise, if nvidia 'magically' fixes the async issue (even if only up to 31/32 queues), Oxide's reputation will be questionable.
SUSngkhanmein
post Sep 1 2015, 11:35 AM

カラカラ Karakara
*******
Senior Member
7,727 posts

Joined: Jan 2010
From: Ara Damansara, Petaling Jaya & Batu Pahat, Johor.


QUOTE(Unseen83 @ Sep 1 2015, 04:35 AM)
but but Nvidia says...

http://blogs.nvidia.com/blog/2015/01/21/wi...10-nvidia-dx12/

" We’re more than ready. GPUs built on our Maxwell GPU architecture – such as our recently released GeForce GTX 970 and GeForce GTX 980 – fully support DX12. "
*
support doesn't mean it will include everything cool2.gif
Moogle Stiltzkin
post Sep 1 2015, 12:33 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
well, it was speculated that nvidia does have async compute, but that most likely it's emulated; considering this recent issue raised by oxide, that could explain their current results.

doh.gif but that is still speculation, though a rather logical one judging by the results so far.
Ronzph
post Sep 1 2015, 05:24 PM

Getting Started
**
Junior Member
120 posts

Joined: Jul 2009
From: Pluto



This --> MSI 980TI Lightning

Any1 Getting this ???

TSskylinelover
post Sep 1 2015, 05:27 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Sep 1 2015, 12:33 PM)
well it was speculated that nvidia does have async compute, but that most likely it's emulated, considering this recent issue by oxide, then that could explain their current results.

doh.gif but that is still speculation, though a rather logical one judging by the results so far.
*
Haha, i guess it's very true, we better wait for pascal laugh.gif
SUSngkhanmein
post Sep 1 2015, 05:55 PM

カラカラ Karakara
*******
Senior Member
7,727 posts

Joined: Jan 2010
From: Ara Damansara, Petaling Jaya & Batu Pahat, Johor.


QUOTE(Moogle Stiltzkin @ Sep 1 2015, 12:33 PM)
well it was speculated that nvidia does have async compute, but that most likely it's emulated, considering this recent issue by oxide, then that could explain their current results.

doh.gif but that is still speculation, though a rather logical one judging by the results so far.
*
even though i'm an NV fan-boy, i can't deny DX12 is going to favour the red side. previously the NV driver was superior, but now it's a different story. this time round DX11 we owned, but DX12, maybe next time lah.. sweat.gif


Unseen83
post Sep 1 2015, 06:20 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(ngkhanmein @ Sep 1 2015, 11:35 AM)
support doesn't mean will include everything cool2.gif
*
the Quote: Focus on "Fully support" cool2.gif

We’re more than ready. GPUs built on our Maxwell GPU architecture – such as our recently released GeForce GTX 970 and GeForce GTX 980 – fully support DX12.

http://blogs.nvidia.com/blog/2015/01/21/wi...10-nvidia-dx12/

This post has been edited by Unseen83: Sep 1 2015, 06:20 PM
Demonic Wrath
post Sep 1 2015, 08:41 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

Funny how NVIDIA actually loses performance when going to the DX12 API in the Ashes of the Singularity benchmark.. maybe they optimized their DX11 drivers too well?

Support async or not, what's important is the actual FPS of the game.. if anyone is quoting the Ashes of the Singularity benchmark saying AMD has the better implementation, check again.. the R9 390X is also performing close to the R9 Fury. (source: http://www.pcgameshardware.de/Ashes-of-the...tX-11-1167997/) It is just that AMD's DX11 implementation is so bad that it makes DX12 look very good.

One thing we know for sure: currently NVIDIA has the market share (82%!). Who knows what will happen with future DX12 games, especially those GameWorks titles.
Unseen83
post Sep 1 2015, 10:18 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(Demonic Wrath @ Sep 1 2015, 08:41 PM)
Funny how NVIDIA actually loses performance when going to the DX12 API in the Ashes of the Singularity benchmark.. maybe they optimized their DX11 drivers too well?

Support async or not, what's important is the actual FPS of the game.. if anyone is quoting the Ashes of the Singularity benchmark to say AMD has the better implementation, check again.. the R9 390X is also performing close to the R9 Fury. (source: http://www.pcgameshardware.de/Ashes-of-the...tX-11-1167997/) It is just that AMD's DX11 implementation is so bad that it makes DX12 look very good.

One thing we know for sure: currently NVIDIA has the market share (82%!). Who knows what will happen with future DX12 games, especially those GameWorks titles.
*
Yeah, funny indeed... but NVIDIA is telling/pressuring Oxide to disable the async compute/shader feature in the bench.

" The rather startling news is that Nvidia's Maxwell architecture, and yeah that would be the entire 900 range does not support it in the way AMD does. I can think of numerous scenarios as to where asynchronous shaders would help."

http://www.guru3d.com/news-story/nvidia-wa...n-settings.html

Add-on: oh bro, a correction for your signature: the "MSI NVIDIA GeForce GTX970 Gaming 4GB GDDR5" is 3.5GB GDDR5 icon_rolleyes.gif

This post has been edited by Unseen83: Sep 1 2015, 10:20 PM
Demonic Wrath
post Sep 1 2015, 11:21 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

QUOTE(Unseen83 @ Sep 1 2015, 10:18 PM)
Yeah, funny indeed... but NVIDIA is telling/pressuring Oxide to disable the async compute/shader feature in the bench.

" The rather startling news is that Nvidia's Maxwell architecture, and yeah that would be the entire 900 range does not support it in the way AMD does. I can think of numerous scenarios as to where asynchronous shaders would help."

http://www.guru3d.com/news-story/nvidia-wa...n-settings.html

Add-on: oh bro, a correction for your signature: the "MSI NVIDIA GeForce GTX970 Gaming 4GB GDDR5" is 3.5GB GDDR5 icon_rolleyes.gif
*
Huh? You mean the GTX970 has physically only 3.5GB GDDR5? Ok...thanks for info

This post has been edited by Demonic Wrath: Sep 1 2015, 11:22 PM
Minecrafter
post Sep 1 2015, 11:25 PM

ROCK N ROLL STAR
*******
Senior Member
5,043 posts

Joined: Aug 2013
From: Putrajaya


QUOTE(Unseen83 @ Sep 1 2015, 10:18 PM)
Yeah, funny indeed... but NVIDIA is telling/pressuring Oxide to disable the async compute/shader feature in the bench.

" The rather startling news is that Nvidia's Maxwell architecture, and yeah that would be the entire 900 range does not support it in the way AMD does. I can think of numerous scenarios as to where asynchronous shaders would help."

http://www.guru3d.com/news-story/nvidia-wa...n-settings.html

Add-on: oh bro, a correction for your signature: the "MSI NVIDIA GeForce GTX970 Gaming 4GB GDDR5" is 3.5GB GDDR5 icon_rolleyes.gif
*
I've seen some of these for the past months since the GTX970's "RAMgate"... biggrin.gif

It has 4GB GDDR5, but 3.5GB is fast, 0.5GB is slow.

No need to make a big problem out of it. wink.gif
JohnLai
post Sep 1 2015, 11:46 PM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(Demonic Wrath @ Sep 1 2015, 11:21 PM)
Huh? You mean the GTX970 has physically only 3.5GB GDDR5? Ok...thanks for info
*
QUOTE(Minecrafter @ Sep 1 2015, 11:25 PM)
I've seen some of these for the past months since the GTX970's "RAMgate"... biggrin.gif

It has 4GB GDDR5, but 3.5GB is fast, 0.5GB is slow.

No need to make a big problem out of it. wink.gif
*
Wait until you tell him that the GTX970 has only 1.75 MB of L2 cache and 56 ROPs instead of the marketed 2MB and 64 ROPs.

Because of the 970's memory segmentation, the effective GPU VRAM bandwidth is 196GB/s, since the fast segment sits on a 224-bit bus.
Originally it was advertised as 224GB/s on a 256-bit bus.
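As a rough sanity check of those bandwidth numbers (just arithmetic on the published figures; the 7 Gbps effective GDDR5 data rate is the 970's stock memory spec, everything else follows from the bus widths above):

CODE
// bandwidth_check.cpp - back-of-envelope GDDR5 bandwidth estimate for the GTX 970
// bandwidth (GB/s) = (bus width in bits / 8) * effective data rate in Gbps per pin
#include <cstdio>

int main() {
    const double gbps_per_pin   = 7.0;   // GTX 970 GDDR5 effective data rate (7 Gbps)
    const int advertised_bits   = 256;   // advertised bus width
    const int fast_segment_bits = 224;   // bus width actually behind the fast 3.5GB segment
    std::printf("advertised: %.0f GB/s\n", advertised_bits   / 8.0 * gbps_per_pin); // 224 GB/s
    std::printf("effective : %.0f GB/s\n", fast_segment_bits / 8.0 * gbps_per_pin); // 196 GB/s
    return 0;
}

which reproduces the 224GB/s and 196GB/s figures quoted above.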

Unseen83
post Sep 1 2015, 11:51 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(Minecrafter @ Sep 1 2015, 11:25 PM)
I've seen some of these for the past months since the GTX970's "RAMgate"... biggrin.gif

It has 4GB GDDR5, but 3.5GB is fast, 0.5GB is slow.

No need to make a big problem out of it. wink.gif
*
Oh, so it's 4GB in total but on separate partitions/speeds: 3.5GB at the faster, advertised speed, while the 0.5GB runs much slower hmm.gif okay, thanks for that info biggrin.gif

QUOTE(Demonic Wrath @ Sep 1 2015, 11:21 PM)
Huh? You mean the GTX970 has physically only 3.5GB GDDR5? Ok...thanks for info
*
Naah, I was wrong.. the GTX 970 does have 4GB GDDR5 in total laugh.gif
Unseen83
post Sep 1 2015, 11:56 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(JohnLai @ Sep 1 2015, 11:46 PM)
Wait until you tell him that the GTX970 has only 1.75 MB of L2 cache and 56 ROPs instead of the marketed 2MB and 64 ROPs.

Because of the 970's memory segmentation, the effective GPU VRAM bandwidth is 196GB/s, since the fast segment sits on a 224-bit bus.
Originally it was advertised as 224GB/s on a 256-bit bus.
*
ohmy.gif Really.. wow.. I did not know that either.. hmm, imagine, I almost went for GTX970 SLI in exchange for my R9 290x crx
JohnLai
post Sep 2 2015, 12:09 AM

Skeptical Cat
*******
Senior Member
3,669 posts

Joined: Apr 2006
QUOTE(Unseen83 @ Sep 1 2015, 11:56 PM)
ohmy.gif Really.. wow.. I did not know that either.. hmm, imagine, I almost went for GTX970 SLI in exchange for my R9 290x crx
*
Lucky for you. You do know about the weird frame time issue when GTX970s are SLI-ed, right?

http://www.pcper.com/reviews/Graphics-Card...sues-Tested-SLI

While the GTX970 can use and address the 0.5GB (32-bit) segment just fine, using it blocks access to the faster 3.5GB (224-bit) segment.
If a program reads from the 3.5GB segment (the default driver behaviour is to prioritize this 224-bit segment), it cannot read from the 0.5GB segment at the same time, and the same applies to write operations.

It can only access both partitions at once if the first partition is being read while the second is being written.
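To illustrate why that last 0.5GB hurts so much, here is a toy back-of-envelope model (my own illustrative sketch, not measured driver behaviour; it simply assumes the two segments serialize as described above, with the slow segment at roughly 32-bit x 7 Gbps = 28 GB/s):

CODE
// segment_model.cpp - toy model of GTX 970 average read bandwidth vs working-set size
// Assumptions (illustrative only): fast 3.5GB segment ~196 GB/s, slow 0.5GB segment ~28 GB/s,
// and reads to the two segments cannot overlap, so their access times simply add up.
#include <algorithm>
#include <cstdio>
#include <initializer_list>

double avg_read_bandwidth_gbs(double working_set_gb) {
    const double fast_gb = 3.5, fast_bw = 196.0;  // 224-bit * 7 Gbps / 8
    const double slow_gb = 0.5, slow_bw = 28.0;   //  32-bit * 7 Gbps / 8
    double in_fast = std::min(working_set_gb, fast_gb);
    double in_slow = std::min(std::max(working_set_gb - fast_gb, 0.0), slow_gb);
    double time    = in_fast / fast_bw + in_slow / slow_bw;  // serialized, not overlapped
    return (in_fast + in_slow) / time;
}

int main() {
    for (double ws : {2.0, 3.5, 3.75, 4.0})
        std::printf("working set %.2f GB -> ~%.0f GB/s average read bandwidth\n",
                    ws, avg_read_bandwidth_gbs(ws));
    return 0;
}

Under those assumptions, average read bandwidth drops from ~196 GB/s at a 3.5GB working set to roughly 112 GB/s once all 4GB is touched, which is why games that spill into the last segment can stutter even though the VRAM is technically "there".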

SUScrash123
post Sep 2 2015, 12:35 AM

Getting Started
**
Junior Member
271 posts

Joined: Aug 2011
QUOTE(Ronzph @ Sep 1 2015, 05:24 PM)
This --> MSI 980TI Lightning

Any1 Getting this ???
*
Too expensive, and GM200 is not the best overclocking chip. A reference GTX 980 Ti + G10 + H55 is enough if you don't like aftermarket coolers. You can get 55-60 Celsius max under OC load

This post has been edited by crash123: Sep 2 2015, 12:36 AM
Moogle Stiltzkin
post Sep 2 2015, 03:22 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(ngkhanmein @ Sep 1 2015, 05:55 PM)
Even though I'm an NV fanboy, I can't deny that DX12 is going to favour the red side. Previously the NV driver was superior, but now it's a different story. This time around we owned DX11, but for DX12, maybe next time.. sweat.gif
*


Looking into this async compute issue, and how nvidia cards would perform in dx12 games, has got me thinking twice about pascal.

I'm just gonna have to wait and see what the reviewers say on that matter.

When it comes out and they test it on Ashes of the Singularity or another, newer dx12 game, then we can know whether pascal will last me a few years (with being able to play dx12 games well in mind) or not hmm.gif

If not, then the choice would be to go for amd (which apparently was not a total flop, because they actually did async compute the right way despite being a power guzzler), whereas nvidia probably used some emulation method that could do it, albeit poorly hmm.gif

Nvidia telling the Ashes devs to disable async compute ... does not inspire much confidence sad.gif

This post has been edited by Moogle Stiltzkin: Sep 2 2015, 03:23 AM
TSskylinelover
post Sep 2 2015, 06:10 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,244 posts

Joined: Jul 2005
QUOTE(Moogle Stiltzkin @ Sep 2 2015, 03:22 AM)


Looking into this async compute issue, and how nvidia cards would perform in dx12 games, has got me thinking twice about pascal.

I'm just gonna have to wait and see what the reviewers say on that matter.

When it comes out and they test it on Ashes of the Singularity or another, newer dx12 game, then we can know whether pascal will last me a few years (with being able to play dx12 games well in mind) or not hmm.gif

If not, then the choice would be to go for amd (which apparently was not a total flop, because they actually did async compute the right way despite being a power guzzler), whereas nvidia probably used some emulation method that could do it, albeit poorly hmm.gif

Nvidia telling the Ashes devs to disable async compute ... does not inspire much confidence sad.gif
*
Dammit, I'm buying a console after this laugh.gif doh.gif no DX12 racing games will be out soon anyway hahahaha
Demonic Wrath
post Sep 2 2015, 08:50 AM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

Since no graphics hardware vendor currently has a card that fully supports DX12, just wait till 2016 or 2017 to get a card with full DX12 feature support.. but then again, maybe by that time there will be DX12.1 features?

Edit: For those who really need full DX12 support, just wait till DX12 games are released and then look at the benchmarks..

This post has been edited by Demonic Wrath: Sep 2 2015, 09:35 AM
Unseen83
post Sep 2 2015, 10:45 AM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(Demonic Wrath @ Sep 2 2015, 08:50 AM)
Since no graphics hardware vendor currently has a card that fully supports DX12, just wait till 2016 or 2017 to get a card with full DX12 feature support.. but then again, maybe by that time there will be DX12.1 features?

Edit: For those who really need full DX12 support, just wait till DX12 games are released and then look at the benchmarks..
*
But but.. AMD never claimed to fully support DX12, while Nvidia proudly claims its Maxwell "FULLY supports DX12". But like you said, Nvidia owns 82% of the market and has all the money, so maybe they can pay off game devs to delay using dx12 while they conjure up pascal. In the end it's one dishonest company fooling its customers/fans... sad.gif
Moogle Stiltzkin
post Sep 2 2015, 11:03 AM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE
Well its not as simple as just wait for a comment.

Here is the truth. Nvidia claimed Async Compute, technically they could argue since they have 32 compute engines they could be considered async however they do not actually fully work the same way people refer to on AMD's Async compute when we talk with async shaders.

Its more like Nvidia was misleading people like they did with the 4gb vram rather than straight up lie. Saying async but having only serial compute is kinda wrong to claim.

https://www.reddit.com/r/nvidia/comments/3j...nvidia_cant_do/


:/

In dx11 nvidia clearly wins, but come dx12, with all sorts of performance tricks you can do, this may change doh.gif It needs a closer look when benchmarks are out, but judging by all the technical mumbo jumbo, it's not looking good for the 980ti doh.gif

Sure, hardly any dx12 games are out yet, but most people buy a card to last 4, maybe 5 years. So you wouldn't want to swap out your card sooner than that just to play a dx12 game as well as it should be played hmm.gif

Maybe the performance penalty isn't too bad, or better yet pascal fixes this? So, have to wait for benchmarks doh.gif
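For anyone wondering what "async compute" even means from the application side, here's a minimal DX12 sketch (my own illustration, not Oxide's or nvidia's code): the app simply creates a second, COMPUTE-type command queue alongside the normal graphics queue and submits work to both. Whether those two streams actually execute concurrently on the GPU is entirely up to the hardware and driver, which is the whole Maxwell-vs-GCN argument here.

CODE
// async_queues.cpp - minimal Direct3D 12 sketch of a graphics queue + a compute queue
// (illustrative only: no swap chain, PSOs, command lists or error handling)
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // The usual graphics ("direct") queue: rendering command lists go here.
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&gfxQueue));

    // A separate compute queue: submitting compute work here is all
    // "async compute" means at the API level.
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    // Work on computeQueue *may* overlap with work on gfxQueue, but the API
    // doesn't guarantee it - the GPU/driver decide whether it truly runs
    // concurrently (like GCN's ACEs) or simply gets scheduled back-to-back.
    return 0;
}

So a card can honestly expose both queue types and still gain nothing from them, which is why the benchmark numbers, not the spec sheet, are what matter.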

This post has been edited by Moogle Stiltzkin: Sep 2 2015, 11:07 AM
Rei7
post Sep 2 2015, 11:09 AM

Game, anime and headphones ❤️
******
Senior Member
1,669 posts

Joined: Apr 2011



QUOTE(Moogle Stiltzkin @ Sep 2 2015, 11:03 AM)
https://www.reddit.com/r/nvidia/comments/3j...nvidia_cant_do/
:/

In dx11 nvidia clearly wins, but come dx12, with all sorts of performance tricks you can do, this may change doh.gif It needs a closer look when benchmarks are out, but judging by all the technical mumbo jumbo, it's not looking good for the 980ti doh.gif
*
980ti only? hmm.gif
eatsleepnDIE
post Sep 2 2015, 11:29 AM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


QUOTE(Rei7 @ Sep 2 2015, 11:09 AM)
980ti only?  hmm.gif
*
All 900 series, I guess.
SUSngkhanmein
post Sep 2 2015, 11:35 AM

カラカラ Karakara
*******
Senior Member
7,727 posts

Joined: Jan 2010
From: Ara Damansara, Petaling Jaya & Batu Pahat, Johor.


QUOTE(Unseen83 @ Sep 2 2015, 10:45 AM)
but but.. AMD never Claim to fully Support DX12, meanwhile Nvidio proudly claim it's Maxwell FULLY DX12 support" but like you said.. Nvidia  own 82% market so got all money so maybe they can pay off game dev to delay using dx12.. while they conjure up pascal, but at the end is about one dishonest company fool its customer/fan...  sad.gif
*
I've seen your posts so many times; this time I fully agree with your statement. NV purposely pays them to delay. I believe pascal will fully support DX12, but for maxwell users, maybe next time.

Maxwell only partially supports DX12, for testing purposes. This is a game that's going to be released soon and favours AMD, yet look at NV's performance, it's really a disaster.

I'm curious whether the 980Ti can hit 6GB of vram at the peak??? hmm.gif

I remember NV's Tom claiming that the 970 and above fully support DX12
Moogle Stiltzkin
post Sep 2 2015, 12:23 PM

Look at all my stars!!
*******
Senior Member
4,451 posts

Joined: Jan 2003
QUOTE(Rei7 @ Sep 2 2015, 11:09 AM)
980ti only?  hmm.gif
*
Not sure about the others. I only care about the best single GPU nvidia has atm doh.gif

Even with this dx12 concern, when it comes down to async compute, a game in a genre like RTS that spawns many units, like AotS, was demonstrating what an RTS could be like when you increase efficiency/performance via dx12. The gains from async compute, if the developer uses it, seem to be quite good on paper :]

It will come down to fps and latency, and whether the difference is big enough to warrant switching camps.

user posted image

QUOTE
Results, Heavy
This set of benchmarks uses only the frame times and averages from the “heavy” third of the benchmark scenes. In theory, this should put more emphasis on the DX12 and CPU implementation for each combination of hardware.

user posted image

http://www.pcper.com/reviews/Graphics-Card...abs_block_tab=1


From the benchmark it clearly shows the amd cards have huge jumps in performance in dx12 compared to dx11. So although the difference in the end result between the two cards is a small margin, it brings up the question: could nvidia's performance have been much better had they used proper async compute like amd? Nvidia's async compute is now being scrutinized for being inferior to amd's, judging by this result.

It might be that pascal leverages dx12 the same way maxwell does, and this is what people are waiting to have answered :/

But if the results are more or less the same, dx11 will be better than amd and dx12 only slightly worse, so nvidia may still be the better choice, albeit disappointing that they didn't fully leverage dx12 async compute (because the benchmark shows it does make a difference if a game decides to use it, RTS especially).

For VR, people may prefer amd for the lower latency compared to nvidia. Not sure the latency difference will be big enough to be a problem for non-VR games in regards to async compute hmm.gif



Demonic Wrath
post Sep 2 2015, 12:34 PM

My name so cool
******
Senior Member
1,667 posts

Joined: Jan 2003
From: The Cool Name Place

From the AoTS benchmark, it seems that NVIDIA's DX11 without async support performs faster than, or similarly to, AMD's DX12 with async compute. (source) Edit: see the frame rate by batch type.

Secondly, Anandtech (source) previously showed that NVIDIA does benefit from a DX12 implementation too. StarSwarm is also developed by Oxide.

This post has been edited by Demonic Wrath: Sep 2 2015, 12:35 PM
Unseen83
post Sep 4 2015, 05:45 PM

TooFAT4U 2Handle!
*******
Senior Member
2,337 posts

Joined: Dec 2008
From: KING CANNEL JB


QUOTE(ngkhanmein @ Sep 2 2015, 11:35 AM)
I've seen your posts so many times; this time I fully agree with your statement. NV purposely pays them to delay. I believe pascal will fully support DX12, but for maxwell users, maybe next time.

Maxwell only partially supports DX12, for testing purposes. This is a game that's going to be released soon and favours AMD, yet look at NV's performance, it's really a disaster.

I'm curious whether the 980Ti can hit 6GB of vram at the peak??? hmm.gif

I remember NV's Tom claiming that the 970 and above fully support DX12
*
Or it could be a lie from Oxide and AMD, working together to bring down Nvidia... like how Nvidia tells game devs to add GameWorks, which cripples GPU performance.. hmm.gif



https://www.youtube.com/results?search_query=FaWbDpEuuk

This post has been edited by Unseen83: Sep 4 2015, 05:47 PM
SUSngkhanmein
post Sep 4 2015, 06:36 PM

カラカラ Karakara
*******
Senior Member
7,727 posts

Joined: Jan 2010
From: Ara Damansara, Petaling Jaya & Batu Pahat, Johor.




Nvm, I still enjoy playing DX11 games. There are too many I can't even finish..
