 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

eatsleepnDIE
post Aug 12 2015, 12:17 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


Yeah, me too. I'm still gaming on a 1080p monitor, but with a 120Hz refresh rate. I tried to achieve that while playing The Witcher 3 and failed miserably lol
TSskylinelover
post Aug 12 2015, 12:28 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,245 posts

Joined: Jul 2005
Woohoo. CGPA 4: ringgit equals 1 USD. At this rate, I think I'd rather jump to Volta in 2017. It will take some sort of miracle to ever get back to the days when 2.8 equaled a dollar. laugh.gif doh.gif
SUSTheHitman47
post Aug 12 2015, 01:35 PM

Nuke
******
Senior Member
1,053 posts

Joined: Sep 2009
From: In Your Mind



QUOTE(skylinelover @ Aug 12 2015, 12:28 PM)
Woohoo. CGPA 4: ringgit equals 1 USD. At this rate, I think I'd rather jump to Volta in 2017. It will take some sort of miracle to ever get back to the days when 2.8 equaled a dollar. laugh.gif doh.gif
*
I feel like it was just last week that our 2.8 = 1 dollar. cry.gif

I don't know if I can still continue with my plan of changing to ITX.
SSJBen
post Aug 12 2015, 05:47 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(skylinelover @ Aug 12 2015, 12:28 PM)
Woohoo. CGPA 4: ringgit equals 1 USD. At this rate, I think I'd rather jump to Volta in 2017. It will take some sort of miracle to ever get back to the days when 2.8 equaled a dollar. laugh.gif doh.gif
*
Lol, 2017? We should plan to seek refuge elsewhere by then if nothing changes over the next year.
Najmods
post Aug 12 2015, 06:02 PM

*mutter mutter mutter mutter*
*******
Senior Member
5,211 posts

Joined: Feb 2005
From: Konohana


QUOTE(clawhammer @ Aug 12 2015, 11:08 AM)
Trust me, start a poll or do a survey outside Low Yat Plaza. I'm pretty sure not many would want to play at 20-30 FPS (seriously, 20 FPS?! shocking.gif)

You certainly know very interesting people who can game at 9 FPS or lower, so I have to salute you for that.
*

Yep, I've seen my cousin play Red Alert 2 at merely 2-5 FPS on a Pentium 233MHz MMX, but you'd be surprised how massive his base is. I don't know how he had the patience for it. I once played at that speed too, but I couldn't handle it (that was on a Pentium 166MHz). From the loading screen to building the first power plant took 15 minutes. No joke.

QUOTE(clawhammer @ Aug 12 2015, 11:08 AM)
I mentioned PC is an expensive hobby, so I'm not sure when I stated that RM3K is cheap. There's seriously no more RM1K budget rig around. Get an i3, RAM, and a board, and that is already close to RM1K. You mean the GPU, PSU, casing, HDD, keyboard, mouse, and LCD can be bought for another RM200?
I was merely judging from your rig and the stuff you sold. I actually wanted to buy your CM PSU, but maybe next month tongue.gif

Nope, I've been asked for that kind of budget for a gaming laptop, a gaming PC, etc., but I can't recommend anything since there isn't one fit for gaming sweat.gif At a minimum it's always at least RM1.5k; squeezing in an APU could get to that budget, with a cheap LCD and a generic case/PSU/keyboard + mouse combo.

When I see people asking for something like 'lower than RM500' and then someone posts 'top up RM100 or RM200 and get these', I don't really agree with that unless the OP stated he can go that far, because for some people even RM50 means a lot.
Moogle Stiltzkin
post Aug 12 2015, 08:52 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
I was thinking of getting Pascal with a waterblock, but looking at the currency at 1:4 to the US dollar, I may have to stick to the stock fan cooler sad.gif

And if it gets even worse, I'd have to downgrade the GPU to a mid-range or lower-spec Pascal variant shakehead.gif



This post has been edited by Moogle Stiltzkin: Aug 14 2015, 05:50 PM
TSskylinelover
post Aug 13 2015, 01:44 AM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,245 posts

Joined: Jul 2005
QUOTE(TheHitman47 @ Aug 12 2015, 01:35 PM)
I feel like it was just last week that our 2.8 = 1 dollar. cry.gif

I don't know if I can still continue with my plan of changing to ITX.
*
I mean American dollars laugh.gif

I think it was 2 years back that I bought a hell of a lot of goodies from eBay at 2.8 per American dollar, including my GPU hahahaha rclxms.gif

You should abort the plan and take refuge elsewhere doh.gif

QUOTE(Moogle Stiltzkin @ Aug 12 2015, 08:52 PM)
I was thinking of getting Pascal with a waterblock, but looking at the currency at 1:4 to the US dollar, I may have to stick to the stock fan cooler sad.gif

And if it gets even worse, I'd have to downgrade the GPU to a mid-range or lower-spec Pascal variant shakehead.gif


*
I feel your pain brah doh.gif doh.gif
Moogle Stiltzkin
post Aug 13 2015, 12:13 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
How much video memory is enough?
4GB versus the world
http://techreport.com/blog/28800/how-much-...emory-is-enough


Bottom line:
QUOTE
Of course, much of what we've just demonstrated about memory capacity constraints is kind of academic for reasons we've noted. On a practical level, these results match what we saw in our initial reviews of the R9 Fury and Fury X: at resolutions of 4K and below, cards with 4GB of video memory can generally get by just fine, even with relatively high image quality settings. Similarly, the GeForce GTX 970 seems to handle 4K gaming quite well in spite of its funky partitioned memory. Meanwhile, at higher resolutions, no current single-GPU graphics card is fast enough for fluid gaming, no matter how much memory it might have. Even with 12GB, the Titan X averages less than 30 FPS in Shadow of Mordor at 5760x3240.
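A quick way to see why the capacity pressure comes from textures and assets rather than the render targets themselves: even at that extreme resolution, the raw framebuffer footprint is tiny next to 4GB. A back-of-the-envelope sketch (RGBA8 and triple buffering are my own illustrative assumptions):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Raw render-target footprint (e.g. triple-buffered RGBA8) at a resolution."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

# Even at 5760x3240, the swap chain alone is only ~214 MB,
# so the other gigabytes of VRAM go to textures, geometry, and other assets.
print(round(framebuffer_mb(5760, 3240)))
```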


QUOTE
The biggest concern, though, is future games that simply require more memory due to the use of higher-quality textures and other assets. AMD has a bit of a challenge to manage, and it will likely need to tune its driver software carefully during the Fury's lifetime in order to prevent occasional issues. Here's hoping that work is effective.


Some games are already 20GB+ rclxub.gif so flash drives need to start gaining capacity more cheaply before games get too big to fit sweat.gif Games on a regular HDD are no longer good enough sad.gif

QUOTE
When or if I buy Wolfenstein: The New Order this year, it will probably be the physical PC version because I don’t want to have to download 50GB on my internet connection and monthly data cap. When I download a game on Steam my maximum download speed is around 1.5 Megabytes per second. Installing a 20GB game usually takes me six hours. If Call of Duty Ghosts — somewhere north of 40GB, shows up on a Steam free weekend, I’d have to spend 12 of those free hours downloading the game. On that note maybe EA’s Origin Game Time is a better take on the free trial idea since it doesn’t start your clock until you’ve actually installed the game.
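The download times in that quote are straightforward arithmetic; a small sketch (illustrative figures only) shows the pure-transfer lower bound, with real-world overhead explaining the gap up to the quoted six hours:

```python
def download_hours(size_gb, speed_mb_per_s):
    """Estimate pure-transfer download time in hours for a given size and sustained speed."""
    size_mb = size_gb * 1024  # treating 1 GB as 1024 MB
    return size_mb / speed_mb_per_s / 3600

# At ~1.5 MB/s, a 20GB game is ~3.8 hours of raw transfer; the quoted
# six hours presumably includes throttling, disk I/O, and install time.
print(round(download_hours(20, 1.5), 1))
# A 40GB+ game is ~7.6 hours minimum, consistent with the quoted 12 hours.
print(round(download_hours(40, 1.5), 1))
```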


QUOTE
Developers should use less pre-rendered FMVs for one thing. The PC version of Metal Gear Rising: Revengeance is nearly 25GB, but as I understand it around 18 of those gigs are consumed by FMVs. The actual game is somewhere around 5GB. The recent PC release of Final Fantasy XIII weighs in at 60GB, with FMVs accounting for 46GB.

FMVs made sense back in the 90’s when real-time video game graphics couldn’t display events in storylines as convincingly. Pre-rendered CG graphics are always a generation ahead of video game graphics, but I’d say real-time graphics have gotten good enough to fully convey storylines. Plus, they mesh better with the actual gameplay. FMVs have also only accelerated their file size increases with the move to encoding them in 1080p.

Somewhat odd are games that use FMVs that are pre-rendered with the same graphics as gameplay. Maybe things are rendered in those cut scenes that the gameplay engine can’t handle or doesn’t need, but I think developers should still try if it can make the difference in file sizes. Can some developers do a better job of compressing the video files if FMVs are unavoidable? Maybe some games should do a better job of storytelling that conveys more through gameplay and less through cut scenes and FMVs.


QUOTE
Then you have asset quality. Titanfall is 48GB on PC because it uses uncompressed audio which is easier on dual core processors. In addition to choice of languages, why don’t they give players a choice on whether uncompressed audio matters to them. The same goes for textures. Some pirate versions of Max Payne 3 come with only one audio language or with compressed textures to cut down on the download size. I wonder how many customers would be receptive to official distributors doing the same thing.


http://venturebeat.com/community/2014/10/1...ng-out-of-hand/



I don't agree that they should use fewer FMVs. I'd rather they just increase flash drive capacities more cost-effectively to accommodate future gaming.

Maybe something like Intel's 3D XPoint will save the day icon_idea.gif


I think the first product will be in 2016 hmm.gif


This post has been edited by Moogle Stiltzkin: Aug 13 2015, 12:34 PM
SSJBen
post Aug 13 2015, 03:14 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


Witcher 3 had very, very few FMVs. The game is only 30GB+ including the latest patches, yet it's one of the biggest games in recent history. Still, it has some of the most terrible animation rigging ever in a AAA game. doh.gif

FMVs would have helped Witcher 3 immensely, to be honest.

CDPR does not have the Fox Engine, just sayin'.
Moogle Stiltzkin
post Aug 13 2015, 07:07 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE(SSJBen @ Aug 13 2015, 03:14 PM)
Witcher 3 had very, very few FMVs. The game is only 30GB+ including the latest patches, yet it's one of the biggest games in recent history. Still, it has some of the most terrible animation rigging ever in a AAA game. doh.gif

FMVs would have helped Witcher 3 immensely, to be honest.

CDPR does not have the Fox Engine, just sayin'.
*
Hm... probably still better than Command & Conquer: Generals. They just cut out FMVs altogether. Cut costs on actors sad.gif

And sadly, even the last Tiberium series title was shit. Kane's acting alone could not save the rest of the poor FMVs or do his acting any justice doh.gif

In Diablo 3, which I closed-beta tested a while back, they didn't do FMVs for everything. Some scenes use in-game characters to act them out. Another way to reduce FMVs... cost cutting, I guess. Or maybe they didn't want cutscenes in parts of the game where they'd needlessly ruin the pacing hmm.gif

This post has been edited by Moogle Stiltzkin: Aug 13 2015, 07:27 PM
TSskylinelover
post Aug 13 2015, 07:22 PM

Future Crypto Player😄👊Driver Abamsado😎😎
********
All Stars
11,245 posts

Joined: Jul 2005
Haha. I still think Red Alert 2 has the best FMVs ever, and nothing has beaten them since, yo. laugh.gif rclxms.gif
eatsleepnDIE
post Aug 13 2015, 11:19 PM

Getting Started
**
Junior Member
168 posts

Joined: Nov 2007


QUOTE(skylinelover @ Aug 13 2015, 07:22 PM)
Haha. I still think Red Alert 2 has the best FMVs ever, and nothing has beaten them since, yo. laugh.gif rclxms.gif
*
Seconded... that Russian babe was hot!
Moogle Stiltzkin
post Aug 14 2015, 04:11 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE(skylinelover @ Aug 13 2015, 07:22 PM)
Haha. I still think Red Alert 2 has the best FMVs ever, and nothing has beaten them since, yo. laugh.gif rclxms.gif
*
Best story narration is definitely Final Fantasy 7. The FMVs might not be up to today's standards, but come a reboot, who knows. Modern FMVs + an FF7 remake = killer game icon_idea.gif
Moogle Stiltzkin
post Aug 18 2015, 12:17 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
The first DX12 game, as far as I know:
Ashes of the Singularity
http://www.extremetech.com/gaming/212314-d...go-head-to-head


:/ Is the game any fun?


SSJBen
post Aug 18 2015, 10:47 PM

Stars deez nuts.
*******
Senior Member
4,522 posts

Joined: Apr 2006


QUOTE(Moogle Stiltzkin @ Aug 18 2015, 12:17 PM)
The first DX12 game, as far as I know:
Ashes of the Singularity
http://www.extremetech.com/gaming/212314-d...go-head-to-head
:/ Is the game any fun?
*
No, it's a boring as fudge RTS. It's also terribly unbalanced at the moment.
antaras
post Aug 19 2015, 10:56 AM

Getting Started
**
Junior Member
196 posts

Joined: Jan 2010
From: Kuala Lumpur


QUOTE(SSJBen @ Aug 18 2015, 10:47 PM)
No, it's a boring as fudge RTS. It's also terribly unbalanced at the moment.
*
We'll need to see whether new drivers improve the performance or not. Currently, AMD is kicking ass in DX12. Then again, it's only ONE benchmark. In any case, still very interesting to see. rclxms.gif
SHOfrE3zE
post Aug 20 2015, 09:30 AM

Drop It Like It's Hot
******
Senior Member
1,895 posts

Joined: Jan 2003
From: Shah Alam


Guys, anyone having problems running GeForce Experience on Windows 10?
It was fine before, but now it won't even launch and gets stuck on the loading screen when I execute the program.

I tried uninstalling and reinstalling, but the problem persists.
Moogle Stiltzkin
post Aug 20 2015, 05:54 PM

Look at all my stars!!
*******
Senior Member
4,454 posts

Joined: Jan 2003
QUOTE
Intel has, according to The Tech Report, decided to support Adaptive-Sync -- but not necessarily in their current product line. David Blythe of Intel would not comment on specific dates or release windows, just that it is in their plans. This makes sense for Intel because it allows their customers to push settings higher while maintaining a smooth experience, which matters a lot for users of integrated graphics.

While “AMD FreeSync” is a stack of technologies, VESA DisplayPort Adaptive-Sync should be all that is required on the monitor side. This should mean that Intel has access to all of AMD's adaptive-refresh monitors, although the driver and GPU circuitry would need to be their burden. G-Sync monitors (at least those with NVIDIA-designed modules -- currently all of them except for one laptop, I think) would be off limits, though.




QUOTE
This is BIG. Intel recently bought Altera, the people who make the G-Sync module for Nvidia. They also cross-license a lot of Nvidia GPU tech.

They would have more insight into the future viability of G-Sync than anyone aside from Nvidia themselves, and they decided to go the AMD route.

THAT'S BIG!!!


QUOTE
That is a bit misleading. Altera just makes FPGAs. They may not have any actual data on how Nvidia's G-Sync module works unless Nvidia sent them the design for debugging or something. Using an FPGA is kind of like using a CPU that you then need to write software for. If I buy a CPU, the maker of that CPU doesn't know what software I run on it. FPGAs are programmed using a hardware description language like Verilog rather than a software programming language. Altera doesn't necessarily have access to the Verilog that Nvidia uses to program the FPGA.

If Nvidia is confident that a larger volume of G-Sync modules will be sold, then they can use the Verilog design to create a fixed-function ASIC. This should be much cheaper, if there is sufficient volume. I tried to find out the price of the FPGA Nvidia is using, and it looked like it was around $200 in small volumes, if I had the right one. Nvidia would get a better price for a large number of parts, though. I don't know who takes the FPGA and mounts it on a board to make the actual G-Sync module. Nvidia probably just contracts this out to some other company.
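The FPGA-versus-ASIC trade-off in that quote comes down to a volume break-even: an ASIC has a big one-time engineering cost (NRE) but cheap units, while FPGAs cost nothing up front but a lot per unit. A rough sketch with hypothetical numbers (the ~$200 FPGA price is the commenter's own guess; the ASIC NRE and unit cost below are purely illustrative assumptions of mine):

```python
def breakeven_units(fpga_unit_cost, asic_nre, asic_unit_cost):
    """Volume at which a fixed-function ASIC (large NRE, cheap units)
    becomes cheaper overall than buying FPGAs (no NRE, expensive units)."""
    return asic_nre / (fpga_unit_cost - asic_unit_cost)

# Hypothetical: $200 per FPGA vs. a $2M ASIC spin at $20 per unit.
units = breakeven_units(200, 2_000_000, 20)
print(int(units))  # ~11,111 modules before the ASIC pays off
```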



Any thoughts on this?

Would it be possible to get G-Sync-like performance via Intel's Adaptive-Sync plan?
http://www.pcper.com/news/Graphics-Cards/I...e-Sync#comments

This post has been edited by Moogle Stiltzkin: Aug 20 2015, 05:57 PM
cstkl1
post Aug 20 2015, 07:54 PM

Look at all my stars!!
Group Icon
Elite
6,799 posts

Joined: Jan 2003

QUOTE(Moogle Stiltzkin @ Aug 20 2015, 05:54 PM)
Any thoughts on this?

Would it be possible to get G-Sync-like performance via Intel's Adaptive-Sync plan?
http://www.pcper.com/news/Graphics-Cards/I...e-Sync#comments
*
The third quote is correct.

Intel won't have the code written onto the FPGA.
And Nvidia's coding is different for each panel, which itself has a QC line for rejection.
Come to think of it... lol, are FreeSync monitors built from the rejected panels that couldn't pass the G-Sync test?


Desprado
post Aug 20 2015, 11:20 PM

Getting Started
**
Junior Member
258 posts

Joined: Feb 2012
Damn, I bought an MSI GTX 980 again.

It has 80.4 ASIC quality and Elpida VRAM.

I am surprised that I am running this card at 1545MHz core and a 3900MHz memory clock (7800MHz effective).

I could push the VRAM even further, but it is Elpida, so I'm scared to do that.
