
Intel 13th/14th gen CPUs crashing, degrading

babylon52281
post Apr 15 2024, 04:51 PM
QUOTE(imbibug @ Apr 15 2024, 03:01 PM)
Around the 10th gen or so, Intel seems to have been juicing up their CPUs in order to look competitive with Ryzen CPUs.
And Intel has done this sort of thing in the past, just not to the extent of what is going on now.

This problem has only gotten worse since. The current 13th/14th gen CPUs will degrade at stock settings because Intel ships ridiculously high power limits. The default limits are already very high, and the 'extreme' ICCMax of 400A for 150W TDP 13th/14th gen CPUs is clearly crazy high. The average PC builder or gamer is not going to know that leaving the settings at default will degrade their CPU permanently in a few short months.

Recently Nvidia pushed back by telling users with Raptor Lake CPUs to contact Intel after getting "out of video memory" errors.
https://www.tomshardware.com/pc-components/...t-intel-support
And even worse, it looks like the Raptor Lake CPUs have degraded in a few months to the point where crashing occurs.
https://www.lowyat.net/2024/320284/gamers-r...-cpus-en-masse/

Sad to see how Intel has sunk to the level of putting the blame on mobo manufacturers for not enforcing limits LOWER than Intel's own specs for long-term reliability.
*
Whoever buys an i9 purely for games really has more money than sense. It's more of a HEDT CPU; if you want a proper gaming CPU, go for the i7, which doesn't seem to be affected.

Anyway, both teams are not being honest about their power draws. Of course Intel is much worse, simply because their silicon quality allows K-SKUs to draw more power than spec, but it's also the mobo makers' fault for actually pushing the CPU beyond reasonable limits when removing power limiters.

Basically, a K-SKU CPU is a car without brakes, and then certain mobo makers remove the RPM limiter too. What happens? The car will go as fast as it physically can until it crashes; the same goes for an unlimited CPU, which figuratively crashes.

Will it degrade the CPU? Yes, eventually, but unless you're running 24/7 at the max turbo limit constantly it shouldn't die so soon. Games typically don't even run the CPU to its max capacity unless there's some weird mismatched combo that hits a hard CPU bottleneck.
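For rough context on why the quoted 400A ICCMax reads as extreme: peak package power is roughly Vcore times current. A back-of-envelope sketch (the voltages are illustrative assumptions, not Intel specs):

[code]
# Implied peak package power at a given ICCMax: P = V * I.
# Real CPUs never sustain ICCMax at peak Vcore; this is an upper-bound sketch.
iccmax_amps = 400  # the 'extreme' profile current limit discussed above

for vcore in (1.10, 1.25, 1.40):  # assumed load-voltage range, illustrative only
    watts = vcore * iccmax_amps
    print(f"{vcore:.2f} V x {iccmax_amps} A = {watts:.0f} W theoretical peak")

# Even at a modest 1.10 V that is ~440 W, far above the 253 W PL2 commonly
# cited for a 13900K/14900K, which is why 400 A looks so crazy high.
[/code]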
babylon52281
post Apr 15 2024, 05:04 PM
QUOTE(Duckies @ Apr 15 2024, 03:11 PM)
Yes...and I have to undervolt + limit power on my 14700K to prevent high temperatures...dammit

In fact AMD CPUs and GPUs are now the best performance/price. If not for Intel's good marketing...
*
If you use an O11 fish-tank case it will easily overheat, because these cases have piss-poor airflow. Get a Lancool 3 with the same parts inside and the temps will be much different.

babylon52281
post Apr 15 2024, 05:06 PM
QUOTE(BL98 @ Apr 15 2024, 03:31 PM)
Time to buy AMD stock, or better to top up NVDA?
*
Buy ARM or Qualcomm stock. Their CPUs destroy both x86 makers in terms of power efficiency. ARM is the future.
babylon52281
post Apr 15 2024, 05:52 PM
QUOTE(Duckies @ Apr 15 2024, 05:08 PM)
Fish tank cases are prettier. At first I wanted to go for the O11 Vision some more... that one's airflow is even worse.
*
You traded performance for looks, so don't complain lar.

In fish tanks you cannot push to the max, so it's really a balancing act of keeping temps in check, just for the looks. For such usage I would prefer a non-K CPU; fewer temp headaches.

babylon52281
post Apr 20 2024, 09:41 PM
QUOTE(hashtag2016 @ Apr 20 2024, 01:32 PM)
Don't worry, be happy.. I think more and more mobo makers will include something like an Intel DEFAULT setting to help users troubleshoot their PC issues (if any),
since Asus already introduced an "Intel Baseline Profile" option in their latest BIOS.. it is a positive move, I think..

although you have to choose between stability and performance ..
https://twitter.com/9550pro/status/1781481593972129929

p/s: IMHO, I think even 12th gen and some of the non-K chips might also face similar issues. (That's just my personal thought, don't quote me please.)
*
So far no stability issues with my PL2-unleashed non-K 12700F. It's been running for 1.5 years with my ASRock B660M PG Riptide mobo. In games it doesn't really hit its max turbo power limit, so I'm still contemplating whether to do a BCLK OC to hit 12700K speeds.
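Rough math on that BCLK idea, as a quick sketch (the turbo ratios are the commonly listed values; verify on Intel ARK before trying):

[code]
# Non-K chips have a locked multiplier, so effective clock = BCLK * ratio
# and raising BCLK is the only lever. Numbers below are assumptions to check.
ratio_12700f = 49    # 12700F max turbo ~4.9 GHz at the stock 100 MHz BCLK
target_ghz = 5.0     # 12700K-class single-core turbo

needed_bclk = target_ghz * 1000 / ratio_12700f
print(f"BCLK needed: {needed_bclk:.1f} MHz")  # ~102.0 MHz, only a ~2% bump

# Caveat: BCLK feeds other clock domains too unless the board has an external
# clock generator, so small steps plus stability testing are essential.
[/code]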
babylon52281
post Apr 21 2024, 10:14 PM
QUOTE(Baconateer @ Apr 21 2024, 03:39 PM)
I'm glad I went with AMD AM5.

I considered getting a 12th/13th gen i5 1X400 chip...

But the lack of an upgrade path after 14th gen made me reconsider.
*
Futureproofing is a fallacy. Recall the AM4 300-series to 400-series issues, and also AM4's limited growth (no PCIe 4, no USB 3.2) by today's standards, even if you can technically still "upgrade" to a newer CPU.

1st-gen AM5 is held back by limited DDR5 speeds. This will get old fast as DDR5 tech matures and speeds go past 10k.

Do not buy for something you think might happen in the future; just buy the best for today.

babylon52281
post Apr 21 2024, 10:20 PM
QUOTE(hashtag2016 @ Apr 21 2024, 08:18 PM)
I think it is a good feature lah.. although not everybody needs to use it..
At least if somebody thinks their build has some stability issue even without turning on XMP, then they can straightforwardly flip this switch,
and if the problem still persists, then they can go straight to claiming warranty (this is only my opinion), no need to quarrel with the PC shop or any seller..

Anyway, we still need to wait for the official investigation results from Intel, to see if they find anything interesting..
*
Not sure what XMP has got to do with it. Whether K-SKU or non-K, both can run XMP at default settings (with the right mobo), and the issues are more about people running their CPUs way over the default power limit settings because their mobos auto-set OC mode when they detect a K-SKU installed.
babylon52281
post Apr 22 2024, 09:43 AM
QUOTE(awol @ Apr 22 2024, 08:04 AM)
ARM maybe, but RISC-V? I'm tired of YT headlines saying RISC-V is the future; it's been many years already.
Seems like only Intel makes mistakes while AMD makes money.
*
AMD's mistake is keeping AM5 prices too f**king high. HUB recently did a review and the basement-budget AM5 mobos are basically all crap, so you cannot do a budget build with AM5 either, which is why AM4 is still hanging around and not getting replaced as it was supposed to be. This is a failure from a product-marketing POV.

And for a fabless maker, they also failed to prioritise the software side of the CPU, leading to systems being less stable (DOCP/EXPO, CCD priority) than an Intel build (and Intel still has to run a fab business on top). In that regard, heck, even Nvidia has a better grip on the various software ecosystems for other uses of their GPUs.

Both have their faults, and surprisingly it is Apple, another fabless brand, that is showing how to design CPU hardware, with their superb M2 Ultra and near-flawless software integration. And then there is ARM...

Apple, ARM > Intel, AMD

All you AMD & Intel fanbois can go home and cry.
babylon52281
post Apr 22 2024, 03:53 PM
QUOTE(awol @ Apr 22 2024, 09:55 AM)
not an AMD & Intel fanboi
this year ARM laptops from Qualcomm will hit the market (again); we'll see how it goes.

I like macOS, but I don't like Apple.
*
Never had an Apple product whatsoever and never figured I would ever need one, but I can appreciate their innovations in pushing electronics design boundaries down to the CPU arch itself. I just don't agree with Apple's pricing for those innovations.
babylon52281
post Apr 23 2024, 05:57 PM
QUOTE(lolzcalvin @ Apr 23 2024, 03:09 PM)
for now. microsoft is currently betting on Snapdragon X Elite for their Windows on ARM. Snapdragon X Elite does look promising.

however, x86-64 may not be going anywhere soon despite the rising popularity of RISC-based instruction set architectures (ISAs).

the reason why Windows still requires x86-64 (for now) is that there is a lot of legacy code dependent on the x86 ISA. many wizards we use in Windows today date back to Win95. this is why x86-64 has hung around for so long -- backwards compatibility for the past 20+ years. hell, even Intel 8086-based software can run on modern x86 CPUs with little tweaking. people may attribute better efficiency to ARM, but it really comes down not to the difference in ISAs but to which direction CPU vendors push while designing their CPUs. for so long both AMD and Intel have been pushing towards high performance (both compete in high-performance computing and put a lower priority on battery life), while Qualcomm and Apple have been pushing for efficiency (both compete in the mobile space and therefore have plenty of experience with low-power operation).

a good read on RISC vs CISC if you're into a bit of technicality: https://chipsandcheese.com/2021/07/13/arm-o...-doesnt-matter/
if you don't already know, x86-64 is CISC-based while ARM and RISC-V are RISC-based
*
Windows on ARM has been tried before and failed (see Windows RT), mainly because there was no backwards portability of the vast existing Windows software library to ARM, so those who bought Windows-on-ARM PCs were disappointed that they could not run their usual software.
For Windows on ARM to be a success, they need easy portability or a majority of ARM-compatible software ready from day one.
babylon52281
post Apr 24 2024, 10:31 AM
QUOTE(TristanX @ Apr 23 2024, 11:11 PM)
The last major breakthrough was Sandy Bridge. It's been a while. Today, Intel is still behind when it comes to the hardware. They are still able to keep up, just with a lot more power.

There are candidates for new materials, like 100GHz nanocarbon I think. I don't think anyone has been able to get it stable.

Lots of things to consider too. Now we have a lot of hackers cracking it, and security patches usually nerf the processors.
*
Arguably Alder Lake is also a major breakthrough, being the 1st consumer big.LITTLE hybrid x86 CPU; the next breakthrough would be Lunar Lake with a proper tiled design.
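Side note on what hybrid means in practice: the P- and E-cores advertise different max frequencies, which the OS exposes. A rough Linux-only sketch (a heuristic that assumes cpufreq sysfs exists; not an official Intel topology API):

[code]
# Group logical CPUs by advertised max frequency. On Alder/Raptor Lake under
# Linux, P-cores report a higher cpuinfo_max_freq than E-cores, so two
# clusters usually show up.
from collections import defaultdict
from pathlib import Path

groups = defaultdict(list)
for cpu in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*")):
    f = cpu / "cpufreq" / "cpuinfo_max_freq"
    if f.exists():
        groups[int(f.read_text())].append(cpu.name)

for khz, cpus in sorted(groups.items(), reverse=True):
    print(f"{khz / 1e6:.1f} GHz max: {cpus}")
# The higher-frequency cluster is usually the P-cores, the lower the E-cores.
[/code]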
babylon52281
post May 5 2024, 04:34 PM
Have the AMD fanbois conveniently forgotten that their Ryzen had its own CPU failure before Intel did?
https://www.lowyat.net/2023/299165/amd-ryze...800x3d-burnout/

I just love how in that previous case it was Intel fanbois pissing on AMD, and today we have AMD fanbois pissing on Intel. Well, guess what, guys from both camps: both AMD & Intel have their share of hardware design failures. It's to be expected when humans design things with billions of transistors; even the best human effort will still have that 0.001% failure rate, which WILL show up considering how many CPUs they sell.

So to those fanbois from both camps, just chill out. Any issue that is under warranty will be resolved through RMA; it's not the end of the world for you. And once you have calmed down, here is a good read for you guys:
https://hwbusters.com/freestyle/are-you-sti...cle-is-for-you/
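To put that failure-rate point in numbers, a trivial sketch (the volume figure is a made-up illustration, not a sales statistic):

[code]
# Even a tiny defect rate becomes a visible pile of RMAs at CPU volumes.
units_sold = 50_000_000      # hypothetical annual volume, illustration only
failure_rate = 0.001 / 100   # the 0.001% mentioned above

print(f"{units_sold * failure_rate:,.0f} failed CPUs")  # 500, enough to flood forums
[/code]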
babylon52281
post May 9 2024, 08:12 PM
QUOTE(hashtag2016 @ May 9 2024, 03:31 AM)
To be fair, I don't think the 7800X3D cases affected many people, since it was very expensive at the time, not many people bought it, and the problem was solved fast.
The Intel issue affects people more widely and causes more 'brain' damage, although it seems not deadly so far..

p/s: I think people have already given Intel special VIP treatment; not many posts were created. If this Intel incident had happened to AMD, I cannot imagine how many posts and threads would have been created by all the angry users..
*
Brother, the 7800X3D is in the same segment & just as pricy as, or even cheaper than, the 13900K & 14900K lar. It didn't affect many because AMD failed to sell as many units, thanks to their pricing fail. That alone turned potential buyers away at the front gate. As a percentage of users, it could even mean a higher proportion of AMD users were affected compared to Intel.

The AMD problem did NOT get solved fast. It took them many burnt CPUs & a whole month to come out with a non-buggy AGESA BIOS to set things back to baseline voltage.

So stop trying to defend them, fanboi. Each of them is responsible when there is a hardware failure. When mobos overspec the CPU, the mobo makers are responsible.
babylon52281
post May 11 2024, 03:36 PM
QUOTE(hashtag2016 @ May 10 2024, 11:49 PM)
You can have your own different opinion on the matter or the product (this is what forums exist for),
but casually shouting fanbois this, fanbois that is not nice...

Seems that Intel will publish an official announcement this month, so they do know how serious the issue is.

p/s: I only mentioned the X3D burning issue after somebody else brought it up, although it is simply an unrelated issue to the topic.
*
Lol, fanbois & haters got burned by the truth and you folks don't like it; well, tough. Yes, you can voice your opinion, but to do that while conveniently ignoring that others have opinions too just means you're a fanboi.

Intel will release an official reply, just as AMD did with their own CPU burn case; that is a given.

Stop with the pissing posts, then maybe people will respect what you say.
babylon52281
post May 11 2024, 03:40 PM
QUOTE(stella_purple @ May 11 2024, 02:42 AM)
the AMD one is worse, it may turn into a fire hazard

*
Both sides' CPUs have issues when pushed too far. This is what happens when efficiency goes out the window: both sides ramp up the power game, then make it worse by allowing users to push even further.

ARM & RISC-V FTW!

babylon52281
post May 11 2024, 03:52 PM
QUOTE(chocobo7779 @ May 11 2024, 08:22 AM)
No PCIe 4? Not that much of a problem for most GPUs, unless you are talking about something like the RX6400/6500XT with their nerfed PCIe x4 interface. Even the mighty 4090 only loses about 2% performance on PCIe 3.0 x16.

On the SSD side of things there's not much difference between PCIe 3.0 and 4.0 either for gaming.

USB 3.2? That more or less belongs to 'nice to have' territory rather than must-have, and even that can be added with a PCIe card if you really need it.
*
Nah, not for the GPU; maybe only a 4090 can fully utilise x16 at Gen4 bandwidth. It is for M.2 SSDs, and even if you don't see much benefit today, DirectStorage will allow faster drives to load world details with less latency, meaning a less laggy gaming experience.

It's like how a few years ago people might not have seen the need for >8GB VRAM on a GPU, but oh boy, isn't that an issue for more & more of today's games. As game worlds become bigger & more detailed, games will sooner or later need to load directly from the SSD at a faster pace.
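The bandwidth math behind that, as a quick sketch (the per-lane rates are the standard published PCIe figures; the drive example is illustrative):

[code]
# Approximate one-direction PCIe bandwidth per lane, GB/s, after encoding
# overhead. These are the standard published per-generation figures.
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical link bandwidth in GB/s for a given generation and width."""
    return PER_LANE_GBPS[gen] * lanes

for gen in ("3.0", "4.0"):  # typical M.2 slots are x4
    print(f"PCIe {gen} x4: ~{link_bandwidth(gen, 4):.1f} GB/s")

# A Gen4 NVMe drive rated ~7 GB/s sequential is choked by a ~3.9 GB/s Gen3 x4
# slot, which starts to matter once DirectStorage-style streaming saturates it.
[/code]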
babylon52281
post May 11 2024, 04:10 PM
QUOTE(lolzcalvin @ May 11 2024, 02:26 PM)
and that is what apple is doing extremely right; at times their low-power performance rivals AMD/Intel high-end CPUs that use 4-5x more power for the same operation. their M4 has just been released, with MT performance closing in on a 13700K and ST performance obliterating many, if not all, modern x86 CPUs. a BASE M4 is doing that? at 5x less power? node advantage + SME aside, you cannot dismiss what Apple has been doing, and they're definitely putting more pressure on AMD/Intel. ESPECIALLY INTEL.

the M2/M3 era already saw the chips performing faster than their x86 counterparts in a number of applications such as Adobe apps, DaVinci Resolve and Handbrake. the new M4 era will be another eye-opener similar to M2.

with the M4 being released this early, Qualcomm is shitting themselves too. I mentioned I had faith in the X Elite previously, but things do change fast within a month. the X Elite has been due for >1 year now. after their shoddy X Plus reveal a few weeks ago, rough rumors are saying they're in a very messy situation rn. we'll see how Qualcomm handles this.
hence why x86 lives on for backwards compatibility, to cater to relic systems. it's been so long since the 8086 era.

however, it really isn't x86's fault for the "slowness", simply because it's just an ISA. a good uarch (microarchitecture, or simply CPU design) will yield great results. Apple has greatly improved their uarch to reach higher frequency, as well as shoving ARM SME into it, even with a small IPC gain (and still yielding ~25% improvement over M3).
*
Fully agree with you. Without being tied to x86 legacy, Apple could wipe the slate clean with a new CPU design, and they clearly showed what a truly modern CPU uarch can do on the current manufacturing process. Both Intel's and AMD's x86 will need some newfangled, complex & expensive SoC layout or exotic materials to push speeds higher, and all that just to match even the current M3/M4 made on the same matured process.

chocobo7779
I don't fully agree on the reason why Apple charges that much. While yes, their CPU SoC is much larger, the cost to make it does not scale up the way the price you pay does. Apple charges rip-off prices simply because it's an Apple. And given the sheer volume, their per-unit CPU cost isn't all that different, as these chips will be inside iPhones too.
babylon52281
post May 11 2024, 04:14 PM
QUOTE(chocobo7779 @ May 11 2024, 03:53 PM)
You can also have inefficient ARM cores if you clock them to the moon; note that dynamic power scales linearly with frequency and quadratically (a power of 2) with voltage.
This is why high clock speeds can be a bad thing if the process node or the architecture is not built for them (even Intel admits it in their slide deck; see the flat-ish curves on the power scaling chart):

https://download.intel.com/newsroom/2022/cl...nalyst-deck.pdf

But then for some reason, despite the huge advances in x86 efficiency, we still have ridiculously inefficient chips, because both Intel/AMD practically clock them up to near-unsustainable levels (I mean, why do 5GHz+ ULV mobile chips and 6GHz+ desktop chips exist?)
*
Well, that's the thing. ARM's shtick is not the power game but power efficiency. Like Netburst vs Core back then: do you want a high-clock inefficient chip, or a cooler, lower-clocked but technically 'faster' chip? The death of Netburst and the rise of Core clearly indicate what the market wants.
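The scaling chocobo7779 describes is the classic dynamic-power relation, roughly P ≈ C · V² · f. A small sketch of why the last few hundred MHz cost so much (the capacitance and V/f points are made-up illustrative values):

[code]
# Dynamic CPU power: P ~ C * V^2 * f. Chasing frequency usually also needs
# more voltage, so power grows much faster than clock. Numbers are illustrative.
C = 2e-8  # effective switched capacitance in farads, made-up constant

def dyn_power(volts: float, f_ghz: float) -> float:
    return C * volts**2 * (f_ghz * 1e9)

base = dyn_power(1.10, 5.0)  # a sane operating point
hot = dyn_power(1.40, 6.0)   # chasing the 6 GHz headline number
print(f"5.0 GHz @ 1.10 V: {base:.0f} W")
print(f"6.0 GHz @ 1.40 V: {hot:.0f} W ({hot / base:.2f}x power for 1.2x clock)")
[/code]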
babylon52281
post May 11 2024, 04:18 PM
QUOTE(chocobo7779 @ May 11 2024, 04:07 PM)
Anyway, let's get back to the topic, we're getting a bit derailed there - the original topic is Intel's 14th gen stability issues, not a discussion of ISAs and power efficiency.
*
More like the original topic was a pissing game between AMD fanbois & Intel fanbois. Both sides are real losers when each has its own hardware issues and better CPU uarchs exist.
babylon52281
post May 11 2024, 04:22 PM
QUOTE(chocobo7779 @ May 11 2024, 04:18 PM)
Yeah, but that'll require wider, larger core designs, which are not area-efficient and will need larger dies, unless you want AMD/Intel to cannibalize their far more profitable server/HPC business, that is.
*
I'm advocating more for fully realising the ARM uarch in a desktop equivalent, or else a new CPU uarch from scratch without the inefficient legacy (hello, RISC-V?).
