Going to buy this!
NVIDIA GeForce Community V13
Post #1 | Sep 2 2014, 04:35 PM
Senior Member | 3,345 posts | Joined: Jan 2003
The ASUS Swift PG278Q G-SYNC monitor is available in MY for RM2769. Going to buy this!
Post #2 | Sep 2 2014, 05:05 PM
Senior Member | 3,345 posts | Joined: Jan 2003
Post #3 | Sep 19 2014, 08:17 PM
Senior Member | 3,345 posts | Joined: Jan 2003
Post #4 | Sep 19 2014, 11:47 PM
Senior Member | 3,345 posts | Joined: Jan 2003
NVIDIA G-SYNC is in trouble with this recent news:
http://www.marketwatch.com/story/industrys...sync-2014-09-18

So, basically, FreeSync isn't just DP 1.2a out of the box. It requires a new scaler/ASIC, just like NVIDIA's G-SYNC module. I've always been sceptical of the claim that DP 1.2a doesn't require changing the current scaler, and AMD has always been tight-lipped about it. NVIDIA was right that DP 1.2a is not the same as G-SYNC, because implementing it in an LCD requires new hardware. So now it is revealed that manufacturers do have to change the scaler, something that hasn't been touched since the beginning of LCDs. This will change the whole industry. And NVIDIA is in trouble if these OEM scalers end up much cheaper than NVIDIA's G-SYNC module, and of course it is a VESA standard.
Post #5 | Sep 22 2014, 06:24 PM
Senior Member | 3,345 posts | Joined: Jan 2003
NVIDIA to support FreeSync.
http://wccftech.com/nvidia-promises-support-freesync/

They should, because it has just been revealed that FreeSync requires a scaler change, which is basically what NVIDIA did with G-SYNC. They have no reason to keep the G-SYNC ASIC in LCDs, since most LCD makers will upgrade their scalers anyway. NVIDIA thought FreeSync would not require a scaler change, which is what AMD initially promised. It turns out it does.
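For anyone wondering what the new scaler actually buys you, here is a rough simulation I put together (my own sketch, not NVIDIA or AMD code; the frame times are made up): with a fixed 60 Hz scanout a finished frame has to wait for the next vblank, while an adaptive-sync capable scaler can start a refresh as soon as the frame arrives.

CODE
// Hypothetical illustration only: compares when a frame becomes visible
// on a fixed 60 Hz refresh versus a variable-refresh (adaptive sync) panel.
#include <cstdio>
#include <cmath>

int main() {
    const double refresh_ms = 1000.0 / 60.0;            // fixed 60 Hz vblank interval
    const double frame_done_ms[] = {20.0, 45.0, 70.0};  // assumed GPU frame finish times

    for (double t : frame_done_ms) {
        // Fixed refresh: the frame is held until the next multiple of ~16.67 ms.
        double next_vblank = std::ceil(t / refresh_ms) * refresh_ms;
        // Adaptive sync: the scaler starts a refresh as soon as the frame is ready.
        double adaptive_ms = t;
        std::printf("frame done at %5.1f ms -> fixed refresh shows it at %5.1f ms "
                    "(+%4.1f ms), adaptive sync at %5.1f ms\n",
                    t, next_vblank, next_vblank - t, adaptive_ms);
    }
    return 0;
}

Compile with any C++11 compiler; the printed deltas are the extra latency a fixed refresh adds per frame, which is the whole point of putting new hardware in the monitor.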
Post #6 | Nov 6 2014, 02:28 PM
Senior Member | 3,345 posts | Joined: Jan 2003
QUOTE(trent666 @ Nov 6 2014, 07:44 AM)
This new offer by Nvidia and Ubisoft is SHIT. It is only for NEW purchases at SELECTED RETAILERS in SELECTED COUNTRIES. http://redeem.nvidia.com/ doesn't even recognize Malaysia's existence. ***! AMD's Never Settle Bundle was much better. FML. If this offer by Nvidia+Ubishit never came out, I would never have had second thoughts about buying a GTX 970 over a 280X.

280X? The 970 performs similarly to the 290X with less power and less noise.
Post #7 | Nov 11 2014, 12:45 PM
Senior Member | 3,345 posts | Joined: Jan 2003
QUOTE(trent666 @ Nov 11 2014, 11:41 AM)
I tested Witcher 2 last night, set on ULTRA. My Gainward Phantom 970 hit about 110-130 fps in game, but the GPU was cooking at 80c! During cutscenes, screen tearing like <<i'm a naughty boy who just used a naughty word>>, because it was running at almost 1000 fps at 64c. (I use Fraps and OpenHardwareMonitor.)

Dear sifus,
1. Is there a way I can play at 60 fps (monitor is 60 Hz) with full details?
2. Is 80c acceptable? Do you guys normally game at around this temp?
3. Is there a way to limit FPS using Nvidia Experience or the Nvidia control panel?
Thank yous.

Turn on v-sync or buy a G-SYNC monitor.
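If you just want a hard cap rather than v-sync, the idea behind an FPS limiter is simple. Here is a bare-bones sketch (my own illustration, not GeForce Experience or the game's actual code; render_frame() is a hypothetical placeholder): after each frame, sleep for whatever is left of the ~16.7 ms budget, so light scenes like cutscenes cannot run off to 1000 fps and cook the card.

CODE
// Hypothetical 60 fps limiter loop, not taken from any real game or driver.
#include <chrono>
#include <thread>

int main() {
    using clk = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667); // ~60 fps target

    for (int frame = 0; frame < 600; ++frame) {   // ~10 seconds worth of frames
        auto start = clk::now();

        // render_frame(); // placeholder for the game's actual rendering work

        auto elapsed = clk::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed); // idle out the rest of the frame
    }
    return 0;
}

Third-party limiters do essentially the same thing; the loop above is just the mechanism, which is why a capped game runs cooler than an uncapped cutscene.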