
 Intel Arc B580 12GB

babylon52281
post Jan 1 2025, 02:38 PM

QUOTE(hdbjhn2 @ Dec 31 2024, 08:23 PM)
Read it, brother, that's exactly what I was saying. Don't blindly follow reviews. Do you think reviews always cover a product well enough?
I've owned products where there were real catches, things that really mattered, but no review covered or even touched them.
But when I googled it, there would randomly be people with the same problem on forums, which nowadays is harder to find because Google's search mechanism only surfaces automated webpages that blindly promote the specs.

Maybe I haven't watched all the reviews, I admit, but take them with a pinch of salt. Reviewers seem to push products hard because they also need to make a living.
Relax, we are consumers; there is nothing wrong with questioning. We are not supporters of any cause.
*
Exactly, bro, but some folks are too caught up in the hype and so influenced by YouTubers and influencers that they only spout the good things and ignore all the bad, because it's not their money they're asking people to part with for an Intel GPU. If these people actually spent their own money running Arc as their daily-driver GPU for gaming and everything else, without complaining or quietly going back to Nvidia/AMD, then I'd be convinced.
babylon52281
post Jan 1 2025, 02:56 PM

QUOTE(Tsuki91 @ Jan 1 2025, 01:48 AM)
Hopefully in a few days most of the shops will bring in a few Arc B580s so I can start buying one.
*
A few days? Unlikely. Expect to wait until around CNY, as it's sold out everywhere in Western markets except for some scalped, overpriced stock.
babylon52281
post Jan 1 2025, 10:31 PM

QUOTE(TristanX @ Jan 1 2025, 04:24 PM)
But this round the reviews are genuine: USD 250 for the B580 12GB with more performance, 270 for the RX 7600 8GB, and 300 for the RTX 4060 8GB. Unless you like paying more for less, and SOMETIMES STUTTERING IN GAMES THAT WANT MORE THAN 8GB, or A POOR-LOOKING GAME DUE TO RUNNING OUT OF VRAM! Intel's drivers have also improved a lot compared to the gen-1 launch.

*
Right. It's the same shilling we heard about the AMD 6000 and 7000 series, yet the market speaks differently. Let's see how this round turns out.
babylon52281
post Jan 4 2025, 09:04 AM

The original finding was here: [embedded video]


I will repeat this, as expected... it's the software stack. For people on older systems and new lower-end budget builders (ironically the very segment the B580 is targeting), it's advisable to wait until Intel gets it right. But that may never happen either (Nvidia never resolved their own higher driver overhead, but it was never this bad).

Let's see how the fanbois try to deny this.

This post has been edited by babylon52281: Jan 4 2025, 09:06 AM
babylon52281
post Jan 4 2025, 05:10 PM

QUOTE(TristanX @ Jan 4 2025, 10:42 AM)
I've got a feeling Resizable BAR is not working properly. It was not intended for CPUs that old (no Resizable BAR [or AMD SAM] until the Ryzen 5000 + Radeon 6000 launch).

System requirements:
https://www.intel.com/content/www/us/en/sup...ics-family.html

From the box:
https://www.techpowerup.com/review/intel-ar...-preview/3.html
*
Hardware Canucks (the discoverers of this bug) tested a 9600K on Z390 with ReBAR both ON and OFF, and both settings showed the issue, so it has nothing to do with ReBAR. Go watch their video.

This post has been edited by babylon52281: Jan 4 2025, 05:11 PM
babylon52281
post Jan 5 2025, 04:20 PM

QUOTE(kingkingyyk @ Jan 4 2025, 09:11 PM)

Zen+ is too slow for games nowadays anyway. ReBAR support is unknown, since the official word from AMD is 3000 series and above.
Given how cheap Zen 3 is these days, a user could just get a 5600 (or 5500) as a drop-in replacement.
*
More often than not, people don't upgrade the CPU because they don't "feel" the performance difference, unlike a GPU upgrade, where games can go from unplayable to smooth FPS.

For these people upgrading an older system, an Nvidia or AMD GPU (especially AMD) will have fewer overhead limitations.

This post has been edited by babylon52281: Jan 5 2025, 04:20 PM
babylon52281
post Jan 5 2025, 04:33 PM

QUOTE(chocobo7779 @ Jan 5 2025, 07:45 AM)
The HWUnboxed investigation into the CPU overhead is also why you should always review budget GPUs with right-sized CPUs. I know many reviewers use high-end CPUs to reduce bottlenecks as much as possible, but no one will ever pair something like a 9800X3D with a USD 250 GPU.

*
Such a testing methodology isn't wrong per se, as it removes the CPU from the equation as a limiting factor; what they want to show is a clear FPS difference. With a 9800X3D you might see a 10 FPS gap over a rival card, but with, say, a 12400F that gap shrinks to 1-2 FPS, and you might go "Meh..." and buy an overpriced Nvidia instead.

Ideally, reviewers should run two types of tests: a high-end 9800X3D test to show a clear FPS difference, and an older/lower-end system test for budget GPUs to show the real-world scenario in which they're likely to be used. However, such a test would degenerate into "buy whichever model is cheapest in your region/country", which doesn't help them push a particular narrative (go AMD! go Arc! anti-Nvidia!).
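Why the gap compresses is easy to model: treat each frame as limited by whichever of CPU or GPU takes longer, so FPS is the reciprocal of the slower side. A minimal Python sketch of that idea (all frame times here are hypothetical, chosen only to illustrate the effect):

CODE
# Toy bottleneck model: a frame completes only when both the CPU and GPU
# work for it are done, so FPS ~= 1000 / max(cpu_frame_ms, gpu_frame_ms).
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical GPU frame times for two rival cards (ms per frame).
GPU_A_MS, GPU_B_MS = 8.0, 9.0  # card A is genuinely ~12% faster

for cpu_name, cpu_ms in [("9800X3D-class", 5.0), ("12400F-class", 8.7)]:
    gap = fps(cpu_ms, GPU_A_MS) - fps(cpu_ms, GPU_B_MS)
    print(f"{cpu_name}: A = {fps(cpu_ms, GPU_A_MS):.0f} FPS, "
          f"B = {fps(cpu_ms, GPU_B_MS):.0f} FPS, gap = {gap:.1f} FPS")

On the fast CPU the two cards sit about 14 FPS apart; on the mid-range CPU the very same cards land within about 4 FPS of each other, which is exactly why reviewers bench on the fastest CPU available.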
babylon52281
post Jan 5 2025, 11:49 PM

QUOTE(kingkingyyk @ Jan 5 2025, 05:17 PM)
Well, there is actually a difference even in JavaScript execution. I went from Zen 2 > Zen 3 > Zen 5, and every upgrade improved website loading by a noticeable margin, thanks to the wide use of client-side rendering these days. Zen+ is really laggy, especially in Electron apps, due to its low single-threaded performance.
*
That's a specific use case that 90% of users won't encounter. Most won't notice a lag of up to 4 seconds between CPUs, and even those who do won't be too bothered.
babylon52281
post Jan 5 2025, 11:55 PM

QUOTE(Tsuki91 @ Jan 5 2025, 07:17 PM)
Truth be told, hardware reviewers should AT LEAST use a mid-range CPU (e.g. an Intel Core i5 or AMD Ryzen 5) for GPUs below the $400 range, so those whose PCs are in more or less the same range can see what they'd be getting. CPUs like the 9800X3D should be reserved for GPUs above the $400 range, IMO.
*
Yes and no. As I explained above, you won't be too picky when it's a 1-2 FPS difference, but that won't do for a review, because the difference isn't clear-cut. It's like comparing two Olympic runners' raw speed after removing all equipment differences between them: you want to know who is actually faster.
babylon52281
post Jan 6 2025, 12:19 PM

QUOTE(kingkingyyk @ Jan 6 2025, 11:10 AM)
Incorrect. Companies have been pushing web-"native" apps everywhere.
Teams, Skype, Discord, and Spotify all use JavaScript for their UI.
Driver software like NVIDIA GeForce Experience, ASUS Framework, and WD Dashboard is JavaScript too. You can find people ranting about high CPU usage from the NodeJS process, less so for CEF.
Feeling slow with your Windows 11 Start menu? It's JavaScript!

JavaScript is notoriously inefficient and single-threaded.
*
Again, yes and no. Not everything uses JavaScript, and high CPU usage isn't something "regular" PC users notice beyond the CPU fan ramping up louder. They might notice certain background programs running a bit slower when too many things are loading at once, but generally people don't know what "NodeJS" is, and won't care about a few extra seconds of delay enough to warrant a CPU upgrade.

And Win11 is sluggish because... it's Win11, with all the unnecessary bells and whistles and AI and telemetry. Why Win11 is a hog has been done to death. I dual-boot Win10 and Win11 and switch between them regularly, and while I do "feel" a general sluggishness, it's not to the point that I'd go back to Win10 permanently.
babylon52281
post Jan 6 2025, 04:03 PM

QUOTE(Tsuki91 @ Jan 6 2025, 02:33 PM)
I find Windows 11 runs best with a local account, since it doesn't bug you every time about turning on backups, eat resources in the background, annoy you with Windows Updates, etc.
*
Even with a local account, if you install the MS-intended way you will still get lots of telemetry and other automated processes slipping through. Installing via Rufus helps deactivate a lot of these, or prevents them from being installed at all. Tiny11 is even less of a hog, but I'm not comfortable using a modded OS.

FWIW, I'm running a clean Win11 Pro with all of that turned off as much as possible. Not sure if you can do the same on Win11 Home, though.
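For anyone curious whether their install actually has telemetry restricted, here is a read-only Python sketch that checks the documented Windows telemetry policy value that debloat routes (Rufus's install-time options, Tiny11, etc.) typically set; treat the exact behaviour of each tool as an assumption on my part:

CODE
# Read-only peek at the Windows telemetry policy value. An absent key
# means "not configured", i.e. the OS default telemetry level applies.
import winreg

KEY = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"
try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
        value, _ = winreg.QueryValueEx(k, "AllowTelemetry")
        print(f"AllowTelemetry policy = {value} (0 = most restricted)")
except FileNotFoundError:
    print("No telemetry policy configured; Windows default applies.")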

EDIT: Actually, if it asks you (like Win7/10 did), that's a good thing, because you get the option to say no. Win11 just runs things in the background without your OK, which annoys a lot of users.

This post has been edited by babylon52281: Jan 6 2025, 04:08 PM
babylon52281
post Jan 6 2025, 11:43 PM

HUB has confirmed it is a driver overhead issue. Again: software, people!

babylon52281
post Jan 7 2025, 08:44 AM

Getting a bit sidetracked with this CPU-upgrade business, but admittedly, if there is one reason a regular PC user with even a slightly older system (with ReBAR) would want to upgrade the CPU, it's that they made the mistake of getting an Arc GPU.


So here's a tip: if you've upgraded to a B580 and your system is, say, 5 or more years old, you need to upgrade the CPU as well to fully utilise this GPU; you can lose up to 40% of its performance if you don't. Older Intel users might have to change platform too.
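To put that "up to 40%" in concrete numbers (the baseline FPS is hypothetical; only the 40% figure comes from the claim above):

CODE
# Worked example of the claimed worst-case Arc loss on an older CPU.
baseline_fps = 100.0    # hypothetical B580 result on a current CPU
worst_case_loss = 0.40  # "up to 40%" as claimed above
older_cpu_fps = baseline_fps * (1 - worst_case_loss)
print(f"{baseline_fps:.0f} FPS -> {older_cpu_fps:.0f} FPS on the older CPU")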

In that case, you might wonder: if you had that much money, why didn't you go for a 7800 XT or a 4060 Ti 16GB? Oops.
babylon52281
post Jan 10 2025, 08:40 AM

QUOTE(Doraku @ Jan 10 2025, 08:07 AM)
The Arc B580 is RM1,809 at Synapse Malaysia. I think it's dead on arrival when you can buy a 4060 Ti 8GB for the same price.
*
Not from the official distributor; it's likely parallel-imported, so they've jacked up the price to fish for early birds gullible enough to overpay just to try it out. Also, I see a Sparkle (Orc?) 3-fan version on Shopee for RM2,088. Go figure.

Better to wait for official pricing from resellers.
babylon52281
post Jan 11 2025, 12:51 AM

QUOTE(Tsuki91 @ Jan 10 2025, 07:13 PM)
My budget is RM1,500, no more, no less. The 4060s are not on the list because their price-to-performance is not as good as the B580's. My list of games is mostly VRAM-intensive, and the 4060 is not up to the task, since I also don't rely on software crutches like DLSS, let alone FG, as the increase in overall latency (even with Reflex) is a turn-off for me. I just hope next week there will be some news of B580s popping up in the local market at reasonable prices.
*
China's factories (where these GPUs are made) will begin shutting down from the 20th until a week after CNY. If you haven't seen it here by then, expect it after CNY.
babylon52281
post Jan 14 2025, 03:17 PM

It's funny how the fanbois keep grasping at anything on the surface to fight their lost cause. Want to talk about Nvidia driver overhead too? Well, take a look.

https://www.techspot.com/review/2940-intel-...-b580-rereview/

Is the Nvidia drop real? Well, yes... versus Radeons. But look:
AMD lost 5 FPS
Nvidia lost 10 FPS
Arc BMG lost 20 FPS

Mind you, that video was making a point about Nvidia vs Radeon, and Nvidia GPUs scale relatively similarly going down the CPU ladder. Meanwhile, the real issue, the one the fanbois are trying to deflect from, is that BMG kills itself on slower, older CPUs.

It is the B580 (with a 9800X3D) vs the B580 (with a 5600)!

And here is the damning verdict from Techspot/HUB: "Anything slower than the Ryzen 5 5600, and the recommendation shifts firmly to a Radeon or GeForce GPU instead. For the Arc B580 to maintain the excitement we initially had for it, it really needs to be paired with a Ryzen 5 7600, Core i5-13600K, or a better CPU."

This post has been edited by babylon52281: Jan 14 2025, 03:32 PM
babylon52281
post Jan 14 2025, 10:35 PM

LOL, someone thinks I'm a fanboi? I guess you haven't been here long enough to see that I whack all of them when they're bad and recommend certain ones when they're value for money.

Fanbois like to bring up the earlier positive influencer reviews of the B580, but now turn around and can't accept the re-reviews and their damning verdict? Selective much? But I guess that's what fanbois do.

Want to whack the 4000 series? Go ahead; Nvidia's last good value was the original 3000-series releases. Everything after was overpriced and barely better than the previous gen without leaning on tricks like FG.

Want to use a review with a mere 15-game comparison to say Nvidia's overhead is no big deal? Try TS/HUB's 50-game comparison. Yes, five-zero: fifty. Bigger data wins. The overhead is real, just not as bad as Arc's.

Think I'm an AMD fanboi? You mean you like that AMD Malaysia is smoking something strong, thinking their GPUs are hot sellers and pricing them level with Nvidia, when globally their MSRP sits one tier below each Nvidia segment? You like paying local markups for Radeons? Haha.

Budget buyers don't have much recourse if they buy the wrong thing, so I try to help them make the proper choice once they have all the info. If you think I'm a fanboi, too bad for you.

This post has been edited by babylon52281: Jan 14 2025, 10:38 PM
babylon52281
post Jan 14 2025, 10:47 PM

QUOTE(bluecat68 @ Jan 14 2025, 09:53 PM)
The card is still not available in the market, and that can be a good thing for anyone interested in learning more about it before deciding.

*
For now, wait and see if Intel can fix this overhead issue; then again, Intel isn't as strong on the software side as they are at the hardware level. Think of the Spectre and Meltdown firmware holes and the crippling performance loss after patching; the three (so far!) BIOS releases in one month to stop 13th/14th-gen CPU degradation; the Arc driver bugs that crashed some AAA games on day one; and today's driver overhead. I'm not confident Intel can solve it anytime soon.


babylon52281
post Jan 15 2025, 03:14 PM

For those still keeping a level head, not taken in by all the shilling and the fanbois' later defensive tactics: Chips & Cheese did a deep dive into Intel Arc's poor driver overhead to try to find an explanation.

https://chipsandcheese.com/p/digging-into-d...rhead-on-intels

For those in the know, C&C's coverage is deep-end stuff, bordering on engineering discussion. Their other write-ups on each CPU microarchitecture's merits, pros and cons are quite interesting if you want to know how your modern AMD/Intel CPU ticks.

Anyhow, they tried to find where Arc BMG's performance goes, and they somewhat failed to find a silver bullet. In DX11, Arc seems to need to interact with the CPU a lot more than AMD does, and its driver's kernel side is less efficient, leading to the overhead loss. However, DX12 and Vulkan overhead appears similar to AMD's, which doesn't explain the big losses on older CPUs.
Maybe older CPUs talk to the GPU over an older, slower bus than newer systems do? Coupled with Arc's poor CPU interface, that would add more latency to the process, leading to a bigger performance loss? Hmm...
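One way to picture that hypothesis: if a driver burns more CPU time per draw call, the penalty grows as the CPU gets slower, while the GPU-bound ceiling stays put. A toy Python model (every number here is invented for illustration, not a measurement of Arc):

CODE
# Toy model: CPU-side frame cost = draw calls x per-call driver cost,
# scaled by CPU speed. A heavier driver loses disproportionately more
# FPS on a slower CPU even though the GPU itself is unchanged.
DRAW_CALLS = 5000
GPU_MS = 10.0  # GPU-bound frame time, i.e. a 100 FPS ceiling

def fps(per_call_us: float, cpu_speed: float) -> float:
    cpu_ms = DRAW_CALLS * per_call_us / 1000.0 / cpu_speed
    return 1000.0 / max(cpu_ms, GPU_MS)

for driver, per_call_us in [("lean driver", 1.2), ("heavy driver", 2.4)]:
    fast, slow = fps(per_call_us, 1.0), fps(per_call_us, 0.5)
    print(f"{driver}: {fast:.0f} FPS on a fast CPU -> {slow:.0f} FPS on "
          f"one half as fast ({100 * (1 - slow / fast):.0f}% loss)")

The lean driver sheds about 17% from the CPU downgrade while the heavy one sheds 50%, which matches the shape of the B580-with-9800X3D vs B580-with-5600 numbers above.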
babylon52281
post Jan 15 2025, 09:59 PM

As I said many pages back, Intel is still new to the dGPU game; they need a minimum of three generations (up to Celestial) to even start being competitive, and that's if they can pull their heads out of their arses, which are still firmly stuck there today, so give it at least until the 4th gen (Druid). Nvidia and AMD (ATI) have had decades to mature their hardware to maximum optimisation. And that's on top of the software/driver issues Intel is famous for dragging its feet on.

And when I point out these truths, the fanbois start defending. LOL. Everyone threw logic out the window, and yet the hard truth keeps coming back!
