 NVIDIA GeForce Community V15 (new era pascal), ALL HAIL NEW PASCAL KING GTX1080 out now

Moogle Stiltzkin
post Aug 7 2015, 09:27 PM

QUOTE(crash123 @ Aug 7 2015, 08:50 PM)
lol you don't get the point I was making. I'm saying that when the Pascal refresh, PCIe x16 Gen 4, the Skylake refresh, cheap DDR4 and NVMe are all released, that's the year I'll upgrade my rig. Btw, PCIe x16 Gen 4 is released next year or in 2017.

PCIe x16 Gen 3.0 = 8 GT/s bit rate, doubling the lane bandwidth compared to PCI Express 2.0
PCIe x16 Gen 4.0 = 16 GT/s bit rate, doubling the lane bandwidth compared to PCI Express 3.0

so on Gen 4 you can run tri-SLI and still get the same per-card bandwidth as PCIe x16 Gen 3.0  drool.gif
*
gen4 ? oo didn't notice that doh.gif

so how much GT/s does the current 980 Ti pump out atm doh.gif ?

oh nm, found it for the Titan X
http://www.tomshardware.co.uk/nvidia-gefor...view-33214.html

Texture fillrate: 192 GT/s (gigatexels per second)

Memory transfer rate: 7 GT/s (the GDDR5 data rate per pin)


so even Gen 4 isn't going to be enough ?

QUOTE
Basically, NVLink provides a bigger pipe between the GPU and the CPU, and therefore, a much bigger data path, at least by today's (or the immediate future's) standards. For example, PCIe 3.0 transfers data at an impressive 8 gigatransfers per second (GT/s), while Nvidia's NVLink is expected to move data at about 20 GT/s, which is over twice as fast.


so is that NVLink 20 GT/s needed even for single GPUs ? hmm.gif
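side note on the units before comparing any of these: the Titan X's "192 GT/s" is gigatexels per second (texture fillrate) and the "7 GT/s" is the GDDR5 data rate per pin, so neither is the same unit as PCIe's gigatransfers per second. For the PCIe figures themselves, here's a rough back-of-the-envelope sketch (my own illustration in Python, assuming the standard 128b/130b encoding of PCIe 3.0/4.0):

CODE
# Rough usable PCIe bandwidth per direction, in GB/s.
# One transfer moves one bit per lane; 128b/130b is the published
# PCIe 3.0/4.0 encoding overhead, not a measured number.

def pcie_bandwidth_gbs(gt_per_s, lanes, efficiency=128 / 130):
    return gt_per_s * lanes * efficiency / 8

print(pcie_bandwidth_gbs(8, 16))   # PCIe 3.0 x16: ~15.8 GB/s
print(pcie_bandwidth_gbs(16, 16))  # PCIe 4.0 x16: ~31.5 GB/s
print(pcie_bandwidth_gbs(16, 8))   # PCIe 4.0 x8:  ~15.8 GB/s, same as Gen 3 x16,
                                   # which is the tri-SLI point above

so Gen 4 doubles the per-lane rate, which is where the "Gen 4 x8 = Gen 3 x16" argument comes from.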

This post has been edited by Moogle Stiltzkin: Aug 7 2015, 09:29 PM
Moogle Stiltzkin
post Aug 7 2015, 10:19 PM

QUOTE(crash123 @ Aug 7 2015, 09:44 PM)
Faster transfer rates matter more to multi-GPU users. For a single-GPU user it won't give much of a performance benefit.

read this article to learn more  thumbup.gif
*
will do tx notworthy.gif
Moogle Stiltzkin
post Aug 8 2015, 06:55 PM

some blockbuster titles based on nvidia gameworks



Moogle Stiltzkin
post Aug 9 2015, 02:14 PM

QUOTE(Minecrafter @ Aug 8 2015, 10:17 PM)
I'll rage quit if EA use Gameworks for FIFA...
*
well i just recently got...... *cough* Lords of the Fallen, which is a heavily GameWorks-developed game.

Seemed pretty decent graphics-wise hmm.gif

Though oddly I had an issue on Windows 10 x64 where I couldn't play it unless I set PhysX to CPU-only in the NVIDIA control panel settings; otherwise it would crash when loading a game save.

anyway, the game still crashes randomly. Seems not too stable on Windows 10 hmm.gif



QUOTE(SSJBen @ Aug 9 2015, 02:03 PM)
MYR 5.0 to USD 1 at that time... whistling.gif
*


if someone could edit that song to say you have to buy an AMD card instead doh.gif lelz...

This post has been edited by Moogle Stiltzkin: Aug 9 2015, 02:17 PM
Moogle Stiltzkin
post Aug 9 2015, 02:56 PM

TweakGuides updated their NVIDIA graphics settings guide here
http://www.tweakguides.com/NVFORCE_1.html


very useful tips like
QUOTE
For example, if you enable MFAA in the NVCP, then select 2x MSAA in a game, you will get the equivalent of 4x MSAA quality without any extra drop in performance; set 4x MSAA in the game and MFAA will convert it to 8x MSAA quality for free, and so on.
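if it helps to see that rule laid out (a minimal sketch of the mapping exactly as TweakGuides describes it, not an official NVIDIA table):

CODE
# MFAA's effect per the quote above: the in-game MSAA level ends up
# looking like the next level up, at roughly the original level's cost.
mfaa_effective = {"2x MSAA": "~4x MSAA quality",
                  "4x MSAA": "~8x MSAA quality"}

for setting, result in mfaa_effective.items():
    print(f"{setting} + MFAA -> {result}")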


QUOTE
I recommend that Power Management Mode be set to Adaptive under Global Settings. For any games for which you believe your GPU is constantly downclocking, you can change this setting to Prefer Maximum Performance under the Program Settings tab to ensure the highest possible clock rates at all times. Remember that this setting only relates to games and other 3D applications, not to 2D applications or the Windows Desktop. Note also that if you run a multi-monitor and/or high refresh rate display your idle clocks may be slightly higher regardless of this setting, which is normal.


QUOTE
It is recommended that Texture Filtering - Quality be set to High Quality on medium and high-end systems, and High Performance on low-end systems under Global Settings. For particular games where you have performance to spare, you can select High Quality, and for those which are more strenuous, you can select High Performance under Program Settings as required. I can see no real reason to bother with using the Performance or Quality options for this setting, given the performance and image quality difference is extremely small even at the extremes of High Quality and High Performance. It's best just to use High Quality if you prefer the highest image quality, or High Performance if you prefer a potential performance boost. Additionally, there's no need to adjust the Texture Filtering - Anisotropic Sample Optimization and Texture Filtering - Trilinear Optimization settings separately; use this setting as your primary control over texture filtering and allow those to be adjusted automatically by this setting.

QUOTE
There is no simple solution when it comes to VSync. Whether you enable it, disable it, use Adaptive VSync, or set an FPS limit, there are always some compromises involved. The only no-compromises solution is to purchase a G-Sync capable monitor, which is worth considering the next time you want to replace your display.


smile.gif


by the way, just wondering: for AF do you all set it globally ? cause it seems like then i'd have to manually disable AF per game if it has it in its options hmm.gif so isn't it simpler to leave it application-controlled globally, and only set it per-app when the game doesn't have the option ?

This post has been edited by Moogle Stiltzkin: Aug 9 2015, 05:50 PM
Moogle Stiltzkin
post Aug 10 2015, 10:06 AM

QUOTE(kmarc @ Aug 10 2015, 09:25 AM)
After contemplating between a 390X and a GTX 980, I finally decided to go for the GTX 980. Granted, the 390X is way cheaper and almost on par with the 980, but I decided on the 980 due to lower power consumption and less heat. The GTX 980's overclocking headroom is also better, but I don't normally overclock my parts nowadays except for testing..... old already....  biggrin.gif

Nice to be able to easily plop in the card, replacing my MSI GTX760, with the same 8-pin + 6-pin PCI-E connectors. Also great to see that the power consumption is about 190W (based on reviews) as compared to 165W on the old MSI.

Luckily my Cooler Master 690 casing was able to accommodate the card, with less than 2cm to spare!

And so, my first high-end card after using mid-range cards all my life!!!! (AMD 9600 Pro, 7800 GS, 8800 GTS, 8800 GT, GTX 260, GTX 460)

Now thinking whether I should go 1440p  sweat.gif (gaming on 1080p all my life too!)
*
tx notworthy.gif
Moogle Stiltzkin
post Aug 10 2015, 01:50 PM

QUOTE(kmarc @ Aug 10 2015, 12:02 PM)
Why  notworthy.gif ? Too expensive is it?  biggrin.gif

If I'm in KL could have bought/COD it from LYN forumers for RM500 cheaper!!!!  vmad.gif  mad.gif  vmad.gif
*
hm, i replied to the wrong post. Was referring to SSJBen laugh.gif

anyway, 1440p at ultra framerates seems doable even on a 980 Ti, and probably more so with Pascal, so why not doh.gif

question is, will you go for an Acer Predator G-Sync IPS monitor :] ? 144 Hz, G-Sync, IPS. RM2xxx

This post has been edited by Moogle Stiltzkin: Aug 10 2015, 01:51 PM
Moogle Stiltzkin
post Aug 10 2015, 04:58 PM

QUOTE(kmarc @ Aug 10 2015, 02:37 PM)
OMG! 1440p and my wife will already strangle me. 1440p G-Sync..... go straight six feet under!  biggrin.gif

Actually, not willing to spend money on the premium G-Sync tech at the moment.... hopefully next time, when the prices come down. smile.gif
*
actually, if you already have an NVIDIA GPU, then you've got your foot halfway in the door to G-Sync capability.

hm... i think that monitor will be in the low RM3k range now, from what I've heard sad.gif


Anyway, while you're waiting for the price to drop: i suspect the next thing to come out will be quantum-dot film layered on the panel. It will give better colors. It's the next thing over the horizon :}

This post has been edited by Moogle Stiltzkin: Aug 10 2015, 05:06 PM
Moogle Stiltzkin
post Aug 10 2015, 06:56 PM

QUOTE(skylinelover @ Aug 10 2015, 05:12 PM)
Haha, TN panels I don't like laugh.gif doh.gif after hopping to IPS
*
i wonder if the latency difference between this Acer and an ASUS ROG, for example, is that big a deal for an FPS gamer hmm.gif i doubt it.

Moogle Stiltzkin
post Aug 11 2015, 09:52 AM

QUOTE(shikimori @ Aug 10 2015, 09:53 PM)
Seems like people preferred G-Sync over FreeSync in a blind test. Perhaps the 35-90 Hz limitation on the MG279Q is ruining the FreeSync experience.

http://www.tomshardware.com/reviews/amd-fr...event,4246.html

A nice article to read if you are going for a variable refresh rate monitor.
Oh ya, btw, do you guys know how long a replacement GPU warranty claim takes with ASUS?
*
the only major difference is how each handles framerates below its minimum VRR window.

on paper, G-Sync handles it all the way down to 0, whereas FreeSync starts having issues.

so the tests would have to focus on this area specifically hmm.gif

QUOTE
What you're seeing here is a graphed output of how A-Sync and G-Sync behave with regards to refresh rates as framerate begins to drop to low levels.




fast forward to 50 sec into the video, and to 6:30



FreeSync could handle below-window framerates just as well as NVIDIA does, but that depends on whether AMD wants to implement it. FreeSync has the potential to be just as good, but it's not there yet sad.gif
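for anyone wondering what "handling below the window" means mechanically, here's a minimal sketch of the frame-multiplication idea (my own illustration, not AMD's or NVIDIA's actual code), using the MG279Q's 35-90 Hz window as the example:

CODE
# When fps drops below the panel's minimum VRR rate, repeat each frame
# n times so the physical refresh rate lands back inside the window.

def refresh_for_fps(fps, vrr_min=35, vrr_max=90):
    if vrr_min <= fps <= vrr_max:
        return fps, 1                 # in window: refresh simply tracks fps
    n = 2
    while fps * n < vrr_min:          # repeat frames until back in window
        n += 1
    if fps * n > vrr_max:
        return vrr_max, None          # can't fit: fall back to clamping
    return fps * n, n

for fps in (20, 25, 34, 60):
    hz, n = refresh_for_fps(fps)
    print(fps, "FPS ->", hz, "Hz", f"(each frame shown {n}x)" if n else "(clamped)")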

This post has been edited by Moogle Stiltzkin: Aug 11 2015, 10:09 AM
Moogle Stiltzkin
post Aug 12 2015, 08:52 PM

i was thinking of getting Pascal with a waterblock, but looking at the currency at 1:4 to the US dollar, i may have to stick to the stock fan cooler sad.gif

and if it gets even worse, i'd have to downgrade to the mid-spec or lower version of the Pascal variant shakehead.gif



This post has been edited by Moogle Stiltzkin: Aug 14 2015, 05:50 PM
Moogle Stiltzkin
post Aug 13 2015, 12:13 PM

How much video memory is enough?
4GB versus the world
http://techreport.com/blog/28800/how-much-...emory-is-enough


bottom line:
QUOTE
Of course, much of what we've just demonstrated about memory capacity constraints is kind of academic for reasons we've noted. On a practical level, these results match what we saw in our initial reviews of the R9 Fury and Fury X: at resolutions of 4K and below, cards with 4GB of video memory can generally get by just fine, even with relatively high image quality settings. Similarly, the GeForce GTX 970 seems to handle 4K gaming quite well in spite of its funky partitioned memory. Meanwhile, at higher resolutions, no current single-GPU graphics card is fast enough for fluid gaming, no matter how much memory it might have. Even with 12GB, the Titan X averages less than 30 FPS in Shadow of Mordor at 5760x3240.


QUOTE
The biggest concern, though, is future games that simply require more memory due to the use of higher-quality textures and other assets. AMD has a bit of a challenge to manage, and it will likely need to tune its driver software carefully during the Fury's lifetime in order to prevent occasional issues. Here's hoping that work is effective.


some games are already 20GB+ rclxub.gif so flash drives need to start gaining capacity more cheaply before games get too big to fit sweat.gif games on a regular HDD are no longer good enough sad.gif

QUOTE
When or if I buy Wolfenstein: The New Order this year, it will probably be the physical PC version because I don’t want to have to download 50GB on my internet connection and monthly data cap. When I download a game on Steam my maximum download speed is around 1.5 Megabytes per second. Installing a 20GB game usually takes me six hours. If Call of Duty Ghosts — somewhere north of 40GB, shows up on a Steam free weekend, I’d have to spend 12 of those free hours downloading the game. On that note maybe EA’s Origin Game Time is a better take on the free trial idea since it doesn’t start your clock until you’ve actually installed the game.
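the quoted numbers are easy to sanity-check (a quick sketch; it assumes a perfectly steady 1.5 MB/s with zero overhead, so real-world downloads run longer, which is roughly the gap to the author's "six hours" figure):

CODE
# Ideal-case download time at a fixed line speed.
def download_hours(size_gb, speed_mb_s=1.5):
    return size_gb * 1024 / speed_mb_s / 3600   # GB -> MB -> seconds -> hours

for size_gb in (20, 40, 50):
    print(f"{size_gb} GB at 1.5 MB/s: ~{download_hours(size_gb):.1f} h minimum")
# 20 GB -> ~3.8 h, 40 GB -> ~7.6 h, 50 GB -> ~9.5 h, before real-world slowdowns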


QUOTE
Developers should use less pre-rendered FMVs for one thing. The PC version of Metal Gear Rising: Revengeance is nearly 25GB, but as I understand it around 18 of those gigs are consumed by FMVs. The actual game is somewhere around 5GB. The recent PC release of Final Fantasy XIII weighs in at 60GB, with FMVs accounting for 46GB.

FMVs made sense back in the 90’s when real-time video game graphics couldn’t display events in storylines as convincingly. Pre-rendered CG graphics are always a generation ahead of video game graphics, but I’d say real-time graphics have gotten good enough to fully convey storylines. Plus, they mesh better with the actual gameplay. FMVs have also only accelerated their file size increases with the move to encoding them in 1080p.

Somewhat odd are games that use FMVs that are pre-rendered with the same graphics as gameplay. Maybe things are rendered in those cut scenes that the gameplay engine can’t handle or doesn’t need, but I think developers should still try if it can make the difference in file sizes. Can some developers do a better job of compressing the video files if FMVs are unavoidable? Maybe some games should do a better job of storytelling that conveys more through gameplay and less through cut scenes and FMVs.


QUOTE
Then you have asset quality. Titanfall is 48GB on PC because it uses uncompressed audio, which is easier on dual-core processors. In addition to a choice of languages, why don't they give players a choice on whether uncompressed audio matters to them? The same goes for textures. Some pirated versions of Max Payne 3 come with only one audio language or with compressed textures to cut down on the download size. I wonder how many customers would be receptive to official distributors doing the same thing.


http://venturebeat.com/community/2014/10/1...ng-out-of-hand/



i don't agree that they should use fewer FMVs. I'd rather they just increase flash drive capacities more cost-effectively to accommodate future gaming.

maybe something like Intel's 3D XPoint will save the day icon_idea.gif


think the first product will be out in 2016 hmm.gif


This post has been edited by Moogle Stiltzkin: Aug 13 2015, 12:34 PM
Moogle Stiltzkin
post Aug 13 2015, 07:07 PM

QUOTE(SSJBen @ Aug 13 2015, 03:14 PM)
Witcher 3 had very, very few FMVs. The game is only 30GB+ including the latest patches, yet it's one of the biggest games in recent history. And yet it has some of the most terrible animation rigging ever seen in a AAA game. doh.gif

FMVs would have helped Witcher 3 immensely, to be honest.

CDPR does not have the Fox Engine, just sayin'.
*
hm... probably still better than Command & Conquer: Generals. They just cut out FMVs altogether. Cut costs on actors sad.gif

and sadly even the last Tiberium-series title was shit. Kane's acting alone could not do the rest of the poor FMVs any justice doh.gif

in Diablo 3, which i closed-beta tested a while back, they didn't do FMVs for everything. Some scenes use in-game characters to act out the scenes, which is another way to reduce FMVs.... cost cutting i guess. Or maybe they didn't want such scenes in parts of the game where they'd needlessly ruin the pacing hmm.gif

This post has been edited by Moogle Stiltzkin: Aug 13 2015, 07:27 PM
Moogle Stiltzkin
post Aug 14 2015, 04:11 PM

QUOTE(skylinelover @ Aug 13 2015, 07:22 PM)
Haha. I still think Red Alert 2 has the best FMVs ever and nothing has beaten it since, yo. laugh.gif rclxms.gif
*
best story narration is definitely Final Fantasy 7. The FMVs might not be up to today's standard, but come a reboot, who knows. Modern FMVs + an FF7 reboot = killer game icon_idea.gif
Moogle Stiltzkin
post Aug 18 2015, 12:17 PM

the first DX12 game, as far as i know:
Ashes of the Singularity
http://www.extremetech.com/gaming/212314-d...go-head-to-head


:/ is the game any fun ?


Moogle Stiltzkin
post Aug 20 2015, 05:54 PM

QUOTE
Intel has, according to The Tech Report, decided to support Adaptive-Sync -- but not necessarily in their current product line. David Blythe of Intel would not comment on specific dates or release windows, just that it is in their plans. This makes sense for Intel because it allows their customers to push settings higher while maintaining a smooth experience, which matters a lot for users of integrated graphics.

While “AMD FreeSync” is a stack of technologies, VESA DisplayPort Adaptive-Sync should be all that is required on the monitor side. This should mean that Intel has access to all of AMD's adaptive refresh monitors, although the driver and GPU circuitry would need to be their burden. G-Sync monitors (at least those with NVIDIA-design modules -- this is currently all of them except for one laptop I think) would be off limits, though.




QUOTE
This is BIG. Intel recently bought Altera, the people who make the G-Sync module for Nvidia. They also cross-license a lot of Nvidia GPU tech.

They would have more insight into the future viability of G-Sync than anyone aside from Nvidia themselves, and they decided to go the AMD route.

THAT'S BIG!!!


QUOTE
That is a bit misleading. Altera just makes FPGAs. They may not have any actual data on how Nvidia's G-Sync module works unless Nvidia sent them the design for debugging or something. Using an FPGA is kind of like using a CPU that you then need to write software for. If I buy a CPU, the maker of that CPU doesn't know what software I run on it. FPGAs are programmed using a hardware description language like Verilog rather than a software programming language. Altera doesn't necessarily have access to the Verilog that Nvidia uses to program the FPGA.

If Nvidia is confident that a larger volume of G-Sync modules will sell, then they can use the Verilog design to create a fixed-function ASIC instead. This should be much cheaper, if there is sufficient volume. I tried to find out the price of the FPGA Nvidia is using, and it looked like it was around $200 in small volumes, if I had the right one. Nvidia would get a better price for a large number of parts, though. I don't know who takes the FPGA and mounts it on a board to make the actual G-Sync module. Nvidia probably just contracts this out to some other company.



any thoughts on this ?

would G-Sync-like performance be possible via the Intel Adaptive-Sync plan ?
http://www.pcper.com/news/Graphics-Cards/I...e-Sync#comments

This post has been edited by Moogle Stiltzkin: Aug 20 2015, 05:57 PM
Moogle Stiltzkin
post Aug 21 2015, 07:22 PM

Samsung Enters The HBM Market In 1H 2016 – HPC and GPU Ready HBM With Up to 1.5 TB/s Bandwidth and 48 GB VRAM
QUOTE
Starting in 2016, the first markets that Samsung will focus on will include the HPC and graphics departments. Samsung has a wide array of HBM configurations pre-planned.

Each HBM stack will be made from single 8Gb components, ranging across several tiers of HBM SKUs. The entry-level models include the 2-Hi DRAM model that will be integrated on mainstream 2 GB HBM graphics cards (256 GB/s), and performance graphics cards with 4 GB HBM (512 GB/s). Enthusiast graphics cards will ship with 4-Hi DRAM in 2 HBM stacks allowing 8 GB VRAM (512 GB/s) and, finally, 4 HBM stacks for the 16 GB VRAM models (1 TB/s).

On the HPC front, there is a wide array of high-bandwidth, dense memory designs, including 4-Hi DRAMs in 4 HBM stacks featuring 32 GB VRAM (1 TB/s) and the bulky 8-Hi DRAMs configured in 6 HBM stacks with 24 GB and 48 GB VRAM, both models featuring 1.5 TB/s bandwidth. There are also some network-oriented HBM SKUs planned for launch in 2017 with 8-Hi DRAM stacks configured in 1-2 HBM chips. In 2018, Samsung wants to focus on increasing market growth by bringing its HBM designs to new applications.


http://wccftech.com/samsung-enters-hbm-mar...dth-48-gb-vram/

is it a stretch then to guess now what HBM capacity Pascal will have ? hmm.gif

i think Pascal was quoted at 1 TB/s.... so doesn't that then mean 16 GB of VRAM ?? drool.gif

but will they be sourcing from Samsung ? hmm.gif Also, is this product lineup more or less the same as other vendors', meaning that 1 TB/s will definitely mean 16 GB of VRAM ?
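the arithmetic behind that guess is simple enough to sketch (a rough illustration using standard HBM figures: a 1024-bit bus per stack, with the per-pin data rate and stack count as the variables):

CODE
# Aggregate HBM bandwidth = stacks * bus width * per-pin data rate.
def hbm_bandwidth_gbs(stacks, pin_rate_gbps, bus_bits=1024):
    return stacks * bus_bits * pin_rate_gbps / 8

print(hbm_bandwidth_gbs(4, 1))  # 512 GB/s: 4 stacks of HBM1 at 1 Gb/s per pin (Fury X)
print(hbm_bandwidth_gbs(4, 2))  # 1024 GB/s: 4 stacks at 2 Gb/s per pin, the "1 TB/s" figure

note though that 1 TB/s only pins down stacks x speed, not capacity: 4 stacks of 4-Hi 8Gb dies give 16 GB, but 8-Hi stacks would give 32 GB at the same 1 TB/s. So bandwidth alone doesn't guarantee 16 GB.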

This post has been edited by Moogle Stiltzkin: Aug 21 2015, 07:27 PM
Moogle Stiltzkin
post Aug 24 2015, 11:01 AM

QUOTE(crash123 @ Aug 23 2015, 10:10 PM)
Anyone know where I can find an i7-6700K? IdealTech says the stock will arrive in 2 weeks, but I see some people already selling their i5-6600K.
*
doh.gif

Moogle Stiltzkin
post Aug 24 2015, 10:25 PM

QUOTE(crash123 @ Aug 24 2015, 04:14 PM)
Ok.jpeg sad.gif
*
is okay... we were all noobs at one point smile.gif
Moogle Stiltzkin
post Aug 28 2015, 11:29 AM

QUOTE(eatsleepnDIE @ Aug 27 2015, 12:10 PM)
yep, going to be there lol
*
me2 smile.gif i'm calling dibs on the monitor flex.gif

ben you going ?

This post has been edited by Moogle Stiltzkin: Aug 28 2015, 12:24 PM
