NVIDIA GeForce Community V16 (welcum pascal), ALL HAIL NEW PASCAL KING GTX1080 out now
Jul 14 2016, 04:59 PM
Senior Member
4,475 posts Joined: Jan 2003
Is my current i7-3770 good enough for the ASUS GTX 1080 Strix OC 8GB GDDR5X 256-bit that I preordered? Or will it bottleneck? :/
Jul 14 2016, 05:06 PM
Senior Member
9,132 posts Joined: Aug 2005
QUOTE(napster2142 @ Jul 14 2016, 04:45 PM) I thought 1440p had already become the standard gamer resolution, so old titles should be no problem. It will still come down to the game developer's optimization. For example, Doom's recent update brought a performance improvement; if they don't optimize, it will stay where it is.

Yeah, I did see some reviews saying this card can run 4K smoothly on medium-high settings. I just realized that if I drop MSAA to 4x, I can enable NVIDIA TXAA, and my FPS is now between 65-75, which is okay for now. I will ignore it even if my GPU is literally screaming for a CPU upgrade.

Bold part: on my first run, all stock, FPS hardly passed 35 and some scenes dropped below 30 (MSAA 8x); the OC helps a lot.

I wonder if anyone with a current-gen high-end GPU still has Crysis (1st gen), and how it runs now.
Jul 14 2016, 05:20 PM
Forum Admin
44,415 posts Joined: Jan 2003
Jul 14 2016, 05:21 PM
Senior Member
4,475 posts Joined: Jan 2003
Just noticed a shop stocking the EVGA GTX 1080 FTW. They're the only shop I've seen stocking EVGA for this so far.
https://www.facebook.com/IDEALTECHOLOGY#
http://hexus.net/tech/reviews/graphics/934...e-gtx-1080-ftw/
https://www.reddit.com/r/nvidia/comments/4r...ng_performance/

QUOTE(goldfries @ Jul 14 2016, 05:20 PM) will do

QUOTE ASUS STRIX-GTX1080-O8G-GAMING RM3779, Promo RM3499
ASUS STRIX-GTX1080-8G-GAMING RM3619, Promo RM3389
https://forum.lowyat.net/topic/3749533

FYI: ASUS GeForce GTX 1080 8GB ROG STRIX OC = (STRIX-GTX1080-O8G-GAMING)
http://videocardz.com/nvidia/geforce-1000/geforce-gtx-1080
https://www.asus.com/us/Graphics-Cards/ROG-...080-O8G-GAMING/

Thanks to napster2142 for the heads-up; sadly I saw it too late.

This post has been edited by Moogle Stiltzkin: Jul 14 2016, 05:27 PM
Jul 14 2016, 05:22 PM
Junior Member
196 posts Joined: Jan 2010 From: Kuala Lumpur
QUOTE(napster2142 @ Jul 14 2016, 04:09 PM) Hi bro, wanna ask, since your specs are the same as mine: do you play GTA V? Mind sharing your FPS? Thinking my CPU bottlenecks the GPU. Anyway, I'm using a Strix OC GTX 1070 boosting to 20xxMHz and an i5-3570K @ 4.2GHz, but I get FPS around 30-40 @ 1440p. It's an old game; shouldn't it be more FPS?

Oppsss... sorry, can't help you much on this right now. I'm not playing anything else apart from The Witcher 3, and I'll most probably stay with this game for some time more as they just launched the Blood & Wine DLC. Anyway, my setup is really similar; my 3570K is also clocked at 4.1 or 4.2.
Jul 14 2016, 05:22 PM
Forum Admin
44,415 posts Joined: Jan 2003
QUOTE(Moogle Stiltzkin @ Jul 14 2016, 05:21 PM) Just noticed a shop stocking the EVGA GTX 1080 FTW. They're the only shop I've seen stocking EVGA for this so far.

They've been the only one since the GTX 9xx series.
https://www.facebook.com/IDEALTECHOLOGY#
Jul 14 2016, 05:23 PM
Senior Member
889 posts Joined: Jun 2008
QUOTE(Moogle Stiltzkin @ Jul 14 2016, 04:59 PM) Is my current i7-3770 good enough for the ASUS GTX 1080 Strix OC 8GB GDDR5X 256-bit that I preordered? Or will it bottleneck? :/

I have a similar configuration to yours; my current rig also uses an i7-3770. I have pre-ordered an MSI GTX 1080 Gaming X and am now waiting for it to arrive. I am also concerned whether it will bottleneck, as the GTX 1080 will be driving a 1440p display. No more budget this year; maybe next year I'll upgrade to the next Intel CPU generation.

This post has been edited by CPURanger: Jul 14 2016, 05:24 PM
Jul 14 2016, 05:26 PM
Junior Member
97 posts Joined: May 2010 From: Johor Bahru;Shah Alam
QUOTE(adilz @ Jul 14 2016, 04:52 PM) Ahhh, 65-75 fps is more like it, and within a very good range for your setup. Those reviewers benchmark using crazy 8-core 5960X CPUs anyway, so your OC is also okay. Just asking because I learned first-hand that sometimes overclocking your CPU too high can have a negative effect.

I OCed the GPU using the preset mode by Asus; quite impressed, because the spec said 1860MHz, but after gaming it maxed out around 20xxMHz.

My i5-3570K can reach 4.2GHz @ 1.20V stable. I already tried 4.4GHz @ 1.21V (game crashes) and 4.5GHz @ 1.22V (bluescreen). My 2 cents: once you have found the highest frequency, the best voltage, and stability, run the frequency a bit lower for daily usage. Already 3+ years with my Ivy running at 4.2GHz daily on WC.

QUOTE(Moogle Stiltzkin @ Jul 14 2016, 04:59 PM) Is my current i7-3770 good enough for the ASUS GTX 1080 Strix OC 8GB GDDR5X 256-bit that I preordered? Or will it bottleneck? :/

Just read your post; I think it would be okay, since my 3570K can run The Witcher 3 all maxed.
Jul 14 2016, 05:29 PM
Junior Member
500 posts Joined: Oct 2015 From: Penang
QUOTE(Demonic Wrath @ Jul 14 2016, 07:47 AM) On NVIDIA GPUs there is a hardware scheduler: the GigaThread Engine. This GigaThread Engine block is equivalent to (graphics processor + ACEs) in AMD GPUs. NVIDIA doesn't show the internals of their GigaThread Engine. In Vulkan, they have exposed 16 graphics queues and 2 copy engines in the latest driver. Drivers only compile the task lists and send them to the GPU (host to device); how the tasks are distributed is managed by the hardware scheduler (Work Distributor) on the GPU.

DX12 multi-engine capabilities of recent AMD and NVIDIA hardware (Kepler, Maxwell v1 (750 series) and Maxwell v2 (900 series)): http://ext3h.makegames.de/DX12_Compute.html

Kepler removed the hardware scheduler, so there is no hardware scheduler on die. Since Fermi they have also had the GigaThread Engine, but that is one GigaThread engine splitting workloads, compared to 8 ACE units. http://www.anandtech.com/show/9124/amd-div...hronous-shading

QUOTE so we checked with NVIDIA on queues. Fermi/Kepler/Maxwell 1 can only use a single graphics queue or their complement of compute queues, but not both at once; early implementations of HyperQ cannot be used in conjunction with graphics. Meanwhile Maxwell 2 has 32 queues, composed of 1 graphics queue and 31 compute queues (or 32 compute queues total in pure compute mode).

The GMU has 32 truly async compute queues, but it is incompatible with DX12 for unknown reasons: http://www.extremetech.com/extreme/213519-...-we-know-so-far

QUOTE Maxwell does not have a GCN-style configuration of asynchronous compute engines (ACE) and it cannot switch between graphics and compute workloads as quickly as GCN. According to Beyond3D user Ext3h, preemption in Nvidia's case is only used when switching between graphics context (1x graphics + 31 compute mode) and "pure compute context", but he claims that this functionality is "utterly broken" on Nvidia cards at present.

He also states that while Maxwell 2 (GTX 900 family) is capable of parallel execution, "The hardware doesn't profit from it much though, since it has only little 'gaps' in the shader utilization either way. So in the end, it's still just sequential execution for most workload, though if you did manage to stall the pipeline in some way by constructing an unfortunate workload, you could still profit from it."

"Only Maxwell v2 really messed up, as it would require preemption for a context switch between (pure) compute and graphics context, and the preemption isn't working at all currently, so the driver can crash. Oh, and forced sequential execution of compute calls on Maxwell v2 resulted in horrible performance, even though that was expected." https://forum.beyond3d.com/posts/1870728/

Demonic Wrath, just sharing a bit here. I suggest not worrying about it too much, because in the end only actual benchmarks / in-game FPS matter.

This post has been edited by svfn: Jul 14 2016, 05:52 PM
Jul 14 2016, 05:32 PM
Junior Member
267 posts Joined: Oct 2007 From: Kuala Lumpur, Malaysia
QUOTE(Moogle Stiltzkin @ Jul 14 2016, 04:59 PM) Is my current i7-3770 good enough for the ASUS GTX 1080 Strix OC 8GB GDDR5X 256-bit that I preordered? Or will it bottleneck? :/

You can use the 3DMark Fire Strike advanced search: put in your CPU and GPU config and see the results from other users there, then compare against other CPU setups. Here's a quick snap.

I think your CPU should do just fine. Maybe in a game that is really CPU-intensive and can utilize more than 4 cores the difference would be more noticeable, but most games don't do that. I've seen a YouTube video comparing an i7-6700K and an i7-5960X with a GTX 1080 benching a few games, and they perform almost the same (the difference in FPS is tiny compared to the difference in price between the two CPUs).
Jul 14 2016, 05:32 PM
Senior Member
4,475 posts Joined: Jan 2003
QUOTE(napster2142 @ Jul 14 2016, 05:26 PM) I OCed the GPU using the preset mode by Asus; quite impressed, because the spec said 1860MHz, but after gaming it maxed out around 20xxMHz. My i5-3570K can reach 4.2GHz @ 1.20V stable ... Just read your post; I think it would be okay, since my 3570K can run The Witcher 3 all maxed.

I want to max out my madVR settings. I can't use the more VHQ settings with the GTX 680, but definitely can with the 1080.

QUOTE Example: You want to use NNEDI3 for image doubling your anime videos, but you have a crappy GPU that isn't a Titan Black or an R9 290X. Therefore you can only run luma and chroma NNEDI3 image doubling for SD videos, luma NNEDI3 image doubling for HD 720p videos, and nothing at all for Full HD 1080p videos.

The full guide for VHQ madVR settings for anime viewing: https://imouto.my/tutorials/madvr/

For newbs that want an easy setup, KCP has preset settings: http://haruhichan.com/forum/showthread.php...waii-Codec-Pack

I still recommend also looking over the VHQ guide at imouto for further setting adjustments, especially for creating profiles to get the best setup for each video source type.

QUOTE(adilz @ Jul 14 2016, 05:32 PM) You can use the 3DMark Fire Strike advanced search, put in your CPU and GPU config, and see the results from other users there. ... I've seen a YouTube video comparing an i7-6700K and an i7-5960X with a GTX 1080 benching a few games, and they perform almost the same.

Will do, when I get my card (not sure when; preordered but no ETA).

This post has been edited by Moogle Stiltzkin: Jul 14 2016, 05:39 PM
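The per-source-type profiles mentioned above can be set up with madVR's profile rules, which switch setting groups automatically based on the video being played. A rough sketch of such rules (the variable names and exact syntax should be checked against the imouto guide and madVR's own profile documentation; the group names "SD", "720p", and "1080p" here are just illustrative labels):

```
if (srcHeight <= 576) "SD"
else if (srcHeight <= 720) "720p"
else "1080p"
```

Each named group would then hold different scaling settings, matching the quoted advice: full luma+chroma NNEDI3 doubling only for SD sources, luma-only doubling for 720p, and plain high-quality scaling for 1080p where the GPU has no headroom left.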
Jul 14 2016, 05:35 PM
Junior Member
205 posts Joined: Feb 2007
QUOTE(napster2142 @ Jul 14 2016, 04:09 PM) Hi bro, wanna ask, since your specs are the same as mine: do you play GTA V? Mind sharing your FPS? Thinking my CPU bottlenecks the GPU. Anyway, I'm using a Strix OC GTX 1070 boosting to 20xxMHz and an i5-3570K @ 4.2GHz, but I get FPS around 30-40 @ 1440p. It's an old game; shouldn't it be more FPS?

Cutting down the Grass setting and the AA setting will boost your FPS to 80-90. It also depends on the location of your sample; the Grass setting is the killer in GTA V.

PS: I'm using an i5-6600K @ 4.4GHz with a 1070 Super Jetstream.
Jul 14 2016, 05:36 PM
Junior Member
97 posts Joined: May 2010 From: Johor Bahru;Shah Alam
QUOTE(Moogle Stiltzkin @ Jul 14 2016, 05:21 PM) will do
https://forum.lowyat.net/topic/3749533
FYI: ASUS GeForce GTX 1080 8GB ROG STRIX OC = (STRIX-GTX1080-O8G-GAMING)
http://videocardz.com/nvidia/geforce-1000/geforce-gtx-1080
https://www.asus.com/us/Graphics-Cards/ROG-...080-O8G-GAMING/
Thanks to napster2142 for the heads-up; sadly I saw it too late.

Ikr, their promo price is just too tempting. I found them before I bought the SG set, but at that time they had no stock and could not confirm an ETA.

QUOTE(adilz @ Jul 14 2016, 05:32 PM) You can use the 3DMark Fire Strike advanced search, put in your CPU and GPU config, and see the results from other users there, then compare against other CPU setups. ... I've seen a YouTube video comparing an i7-6700K and an i7-5960X with a GTX 1080 benching a few games, and they perform almost the same (the difference in FPS is tiny compared to the difference in price between the two CPUs).

Thanks for sharing; that makes me want to stay another year with my Ivy.
Jul 14 2016, 05:40 PM
Junior Member
97 posts Joined: May 2010 From: Johor Bahru;Shah Alam
Jul 14 2016, 05:48 PM
Senior Member
1,256 posts Joined: May 2008 From: Gotham City
Anyone got any clue on where, or with whom, I can pre-order or pay a deposit for the MSI GTX 1080 Gaming Z?
Jul 14 2016, 05:55 PM
Junior Member
123 posts Joined: Jan 2013 From: Sabah
Moogle Stiltzkin napster2142
Wah.. that promotion price makes my hand itchy leh. Trusted seller kah? Cannot find any feedback.

This post has been edited by Xteoh: Jul 14 2016, 05:57 PM
Jul 14 2016, 06:19 PM
Senior Member
1,667 posts Joined: Jan 2003 From: The Cool Name Place
QUOTE(svfn @ Jul 14 2016, 05:29 PM) DX12 multi-engine capabilities of recent AMD and NVIDIA hardware (Kepler, Maxwell v1 (750 series) and Maxwell v2 (900 series)): http://ext3h.makegames.de/DX12_Compute.html Kepler removed the hardware scheduler, so there is no hardware scheduler on die. Since Fermi they have also had the GigaThread Engine, but that is one engine splitting workloads, compared to 8 ACE units. The GMU has 32 truly async compute queues, but it is incompatible with DX12 for unknown reasons: http://www.extremetech.com/extreme/213519-...-we-know-so-far

Err... Kepler simplified the hardware scheduler, not removed it. In hardware, there needs to be a scheduler to keep track of which SM is idle, which SM can be retasked, and so on. It would not be reasonable for those tasks to go back to the CPU, due to latency.

From their Kepler whitepaper:

QUOTE We also looked for opportunities to optimize the power in the SMX warp scheduler logic. For example, both Kepler and Fermi schedulers contain similar hardware units to handle the scheduling function, including: a) register scoreboarding for long latency operations (texture and load), b) inter-warp scheduling decisions (e.g., pick the best warp to go next among eligible candidates), c) thread block level scheduling (e.g., the GigaThread engine).

As far as the GigaThread engine is concerned, it has 32 hardware-managed queues that can support graphics/compute tasks. It seems it can be repurposed via the driver.

GTX 970: http://vulkan.gpuinfo.org/displayreport.ph...7#queuefamilies
R9 200 series: http://vulkan.gpuinfo.org/displayreport.ph...4#queuefamilies

If you notice:
GTX 970: 16 queues that support GRAPHICS/COMPUTE/TRANSFER, 1 queue that supports TRANSFER only.
R9 200 series: 1 queue that supports GRAPHICS/COMPUTE/TRANSFER, 7 queues that support COMPUTE/TRANSFER, 2 queues that support TRANSFER only.
Jul 14 2016, 06:20 PM
Junior Member
500 posts Joined: Oct 2015 From: Penang
QUOTE(Demonic Wrath @ Jul 14 2016, 06:19 PM) Err... Kepler simplified the hardware scheduler, not removed it. In hardware, there needs to be a scheduler to keep track of which SM is idle, which SM can be retasked, and so on. ... GTX 970: 16 queues that support GRAPHICS/COMPUTE/TRANSFER, 1 queue that supports TRANSFER only. R9 200 series: 1 queue that supports GRAPHICS/COMPUTE/TRANSFER, 7 queues that support COMPUTE/TRANSFER, 2 queues that support TRANSFER only.

Then how do you explain the small gains from the Vulkan API in DOOM with Maxwell 2 (900 series)?

This post has been edited by svfn: Jul 14 2016, 06:55 PM
Jul 14 2016, 06:44 PM
Junior Member
97 posts Joined: May 2010 From: Johor Bahru;Shah Alam
Jul 14 2016, 06:52 PM
Senior Member
1,667 posts Joined: Jan 2003 From: The Cool Name Place
QUOTE(svfn @ Jul 14 2016, 06:20 PM) Then how do you explain the small gains from the Vulkan API in DOOM with Maxwell 2 (900 series)?

A few reasons:
1) The NVIDIA Vulkan driver has not exposed compute-only queues. It probably won't make much of a difference.
2) It's not just Maxwell 2 showing small gains; Pascal shows small gains too. But a gain is a gain. As you can see in the graph you posted, the GTX 980 Ti shows higher gains than the GTX 1070. This video shows it too: https://www.youtube.com/watch?v=ZCHmV3c7H1Q
3) NVIDIA GPUs already have almost peak utilization on average.
4) Some scenes show large gains on NV hardware too, including large gains in some CPU-limited scenes.
5) This game uses AMD shader intrinsic functions (specific to AMD). They are not supported by NVIDIA's shader extension in the current Vulkan driver.
6) The AMD Fury X has 23% more compute performance than the GTX 1070. At peak, it will perform 23% faster, so it is performing as it should.
7) As mentioned before, the AMD OpenGL driver has a high-overhead issue. Once that issue is gone, it will perform as it should. If NVIDIA crippled their OpenGL driver, you'd see significant gains going to Vulkan too (do people prefer this?). There's obviously something wrong if a GTX 970 can perform similar to a Fury X in OpenGL mode...

If the Fury X could outperform the GTX 1080 in Vulkan, that would mean NVIDIA is not performing as well as AMD in Vulkan. But that is not the case here; the GTX 1080 is still leading.
Topic Closed