

Intel 13th/14th gen CPUs crashing, degrading

dexeric
post Aug 9 2024, 10:18 AM

Getting Started
**
Junior Member
118 posts

Joined: Oct 2008


QUOTE(babylon52281 @ Aug 9 2024, 09:29 AM)
Poorly optimised software exists because devs know they can lean on high-power CPUs; high-power CPUs exist because the existing hardware uarch is poorly optimised for power efficiency and IPC; and the poorly optimised uarch exists because back in the 70s and 80s (when x86 was rising), power efficiency wasn't a thing.

AFAIK I haven't heard of a badly written Apple or Android/ARM app that sucks more power than needed to run it. I stand to be corrected.

For the x86 uarch to move forward, it needs to ditch the legacy 16-bit and 32-bit functions and revamp its 64-bit compute to be on par with the SD/M series.

Intel's problem in this thread is symptomatic of x86 limitations: pushing power limits after hitting the node-shrink wall. The AMD 9000 series is also indicative of this, perhaps more so of the future, given they decided to pull the TDP limit back to 65W. As I suspected from the TPU review, with PBO off there was only about 5% more to gain, so AMD too has hit a power/performance wall with the current uarch and node.
*
Not sure why you are comparing a desktop part to a mobile part; the comparison should be the X Elite vs the AMD Ryzen AI 300 series. Most of the time they show some difference in performance and power, but it is not a big difference.

Regarding the uarch: you cannot run 16-bit and 32-bit functions in 64-bit Windows 10, so why point this out? Just because the instruction set still supports them does not mean they are still used on PCs.

Heck, 16-bit isn't even supported now...

Instruction set: AMD64 (x86-64), which supports only 32-bit and 64-bit
Extensions:
Crypto: AES, SHA
SIMD: MMX+, SSE, SSE2, SSE3, SSE4.1, SSE4.2, SSE4a, SSSE3, FMA3, AVX, AVX2, AVX-512
Virtualization: AMD-V

The only problem I see in this ARM vs x86 PC debate is the RISC vs CISC design split, which only comes into play through the legacy design of the operating system, since the instruction sets are different.
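(A quick way to see the 32-bit vs 64-bit point in practice: the Python sketch below uses only the standard library and reports what the current process is running as, not everything the CPU could support.)

```python
import platform
import struct

def pointer_width_bits() -> int:
    """Pointer width of the running process: 64 for a 64-bit
    interpreter, 32 for a 32-bit one (e.g. running under WOW64)."""
    return struct.calcsize("P") * 8

print(platform.machine())      # e.g. 'AMD64' on x86-64 Windows
print(pointer_width_bits())
```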


This post has been edited by dexeric: Aug 9 2024, 10:19 AM
babylon52281
post Aug 9 2024, 10:46 AM

Look at all my stars!!
*******
Senior Member
2,681 posts

Joined: Apr 2017
QUOTE(dexeric @ Aug 9 2024, 10:18 AM)
Not sure why you are comparing a desktop part to a mobile part; the comparison should be the X Elite vs the AMD Ryzen AI 300 series. Most of the time they show some difference in performance and power, but it is not a big difference.

Regarding the uarch: you cannot run 16-bit and 32-bit functions in 64-bit Windows 10, so why point this out? Just because the instruction set still supports them does not mean they are still used on PCs.

Heck, 16-bit isn't even supported now...

Instruction set: AMD64 (x86-64), which supports only 32-bit and 64-bit
Extensions:
Crypto: AES, SHA
SIMD: MMX+, SSE, SSE2, SSE3, SSE4.1, SSE4.2, SSE4a, SSSE3, FMA3, AVX, AVX2, AVX-512
Virtualization: AMD-V

The only problem I see in this ARM vs x86 PC debate is the RISC vs CISC design split, which only comes into play through the legacy design of the operating system, since the instruction sets are different.
*
Ryzen HS and Core HX CPUs are mobile parts, no? But they share the same uarch as the desktop parts, and they are thermally hard to cool, unlike the SD X Elite.

Just because it's not used in Windows doesn't mean the hardware to run it isn't already there.

But it's precisely because Windows no longer interacts at such a base level that it makes no sense to keep them. So why not deprecate and remove the 16-bit and 32-bit functions? Then, by optimising for 64-bit, the transistors saved can be reused to improve things elsewhere, or the die size cut down to reduce cost.

And it's not the end for legacy programs, as software emulation could be used to run them.

This post has been edited by babylon52281: Aug 9 2024, 10:48 AM
dexeric
post Aug 9 2024, 10:58 AM

Getting Started
**
Junior Member
118 posts

Joined: Oct 2008


QUOTE(babylon52281 @ Aug 9 2024, 10:46 AM)
Ryzen HS and Core HX CPUs are mobile parts, no? But they share the same uarch as the desktop parts, and they are thermally hard to cool, unlike the SD X Elite.

Just because it's not used in Windows doesn't mean the hardware to run it isn't already there.

But it's precisely because Windows no longer interacts at such a base level that it makes no sense to keep them. So why not deprecate and remove the 16-bit and 32-bit functions? Then, by optimising for 64-bit, the transistors saved can be reused to improve things elsewhere, or the die size cut down to reduce cost.

And it's not the end for legacy programs, as software emulation could be used to run them.
*
https://en.m.wikipedia.org/wiki/FreeDOS

FreeDOS is 32-bit, not 16-bit.

32-bit support is not as big a deal as you think... It may take up die space, but it doesn't hinder performance, since that part of the chip sits idle when it is not used.
The real problem is the operating system. Try running an AMD processor on Linux, and then an ARM processor on Linux: AMD will still have better performance there.

Plus, there is a huge difference in lithography between the AMD HS processors and the SD X Elite. A better process node increases efficiency and performance by a lot.

1024kbps
post Aug 9 2024, 03:28 PM

李素裳
*******
Senior Member
6,013 posts

Joined: Feb 2007



QUOTE(dexeric @ Aug 9 2024, 10:18 AM)
Not sure why you are comparing a desktop part to a mobile part; the comparison should be the X Elite vs the AMD Ryzen AI 300 series. Most of the time they show some difference in performance and power, but it is not a big difference.

Regarding the uarch: you cannot run 16-bit and 32-bit functions in 64-bit Windows 10, so why point this out? Just because the instruction set still supports them does not mean they are still used on PCs.

Heck, 16-bit isn't even supported now...

Instruction set: AMD64 (x86-64), which supports only 32-bit and 64-bit
Extensions:
Crypto: AES, SHA
SIMD: MMX+, SSE, SSE2, SSE3, SSE4.1, SSE4.2, SSE4a, SSSE3, FMA3, AVX, AVX2, AVX-512
Virtualization: AMD-V

The only problem I see in this ARM vs x86 PC debate is the RISC vs CISC design split, which only comes into play through the legacy design of the operating system, since the instruction sets are different.
*
32-bit is still widely used, and most programs ARE 32-bit.

16-bit is painfully slow, but old games run under DOSBox without feeling sluggish. Before that you could still run 16-bit applications natively, but MS removed NTVDM from newer Windows.

We're running a mix of 32- and 64-bit; some programs still default to 32-bit because devs don't see the benefit of compiling them exclusively for 64-bit.
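The 32-bit vs 64-bit split above is recorded in each program's PE header, so it can be checked without running the program. A minimal Python sketch (the Machine constants come from the PE/COFF format; `pe_bitness` is a name invented for this example):

```python
import struct

# COFF "Machine" values defined by the PE format
MACHINES = {
    0x014C: "32-bit (i386)",
    0x8664: "64-bit (x86-64)",
    0xAA64: "64-bit (ARM64)",
}

def pe_bitness(data: bytes) -> str:
    """Report the target architecture of a Windows .exe/.dll image."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file")
    # The offset of the 'PE\0\0' signature lives at 0x3C in the DOS header
    pe_off = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_off:pe_off + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    machine = struct.unpack_from("<H", data, pe_off + 4)[0]
    return MACHINES.get(machine, hex(machine))

# Usage: pe_bitness(open(r"C:\Windows\notepad.exe", "rb").read())
```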
1024kbps
post Aug 9 2024, 03:39 PM

李素裳
*******
Senior Member
6,013 posts

Joined: Feb 2007



QUOTE(dexeric @ Aug 9 2024, 10:58 AM)
https://en.m.wikipedia.org/wiki/FreeDOS

FreeDOS is 32-bit, not 16-bit.

32-bit support is not as big a deal as you think... It may take up die space, but it doesn't hinder performance, since that part of the chip sits idle when it is not used.
The real problem is the operating system. Try running an AMD processor on Linux, and then an ARM processor on Linux: AMD will still have better performance there.

Plus, there is a huge difference in lithography between the AMD HS processors and the SD X Elite. A better process node increases efficiency and performance by a lot.
*
Linux is very scalable: most routers run on Linux, and plenty of embedded devices are Linux-powered too.
At my workplace my company uses Win10 IoT LTSC on some Atom processors, and it is painfully slow; it should have been on Linux.

For ARM processors, Android devices and Apple products are the best examples: iOS and Android (the former Unix-like, the latter Linux-based) run very well on ARM CPUs (Apple silicon / Qualcomm SD) because they're optimised to run on battery.
ARM can be found everywhere from supercomputers to mobile devices.
1024kbps
post Aug 9 2024, 03:40 PM

李素裳
*******
Senior Member
6,013 posts

Joined: Feb 2007



QUOTE(Baconateer @ Aug 8 2024, 06:48 PM)
id Software developed Doom.

Bethesda is the publisher.

Bethesda can only develop something like Starfield, which is an abomination in terms of optimisation compared to Doom.
*
Yeah lol, I always mix them up.

Anyway, the id Tech engine is a top-tier game engine. In Wolfenstein II I can max everything out at 4K 60 FPS on my ancient Vega 64; not many games can do that.
Too bad not many games use id Tech; hopefully newer ones will.

This post has been edited by 1024kbps: Aug 9 2024, 03:45 PM
1024kbps
post Aug 9 2024, 03:51 PM

李素裳
*******
Senior Member
6,013 posts

Joined: Feb 2007



By the way, we've drifted too far off topic. Intel does provide an extended warranty for the affected CPUs.
Business-ethics-wise Intel isn't very good, and AMD is no better, but hopefully Intel won't go bankrupt.
We finally have Intel GPUs, and I can see some developers adding Intel-GPU-exclusive features to games, e.g. Cyberpunk 2077.

Can't let the AMD/nVidia duopoly dominate again, as without competition GPU prices will always go wild.
babylon52281
post Aug 9 2024, 05:47 PM

Look at all my stars!!
*******
Senior Member
2,681 posts

Joined: Apr 2017
QUOTE(1024kbps @ Aug 9 2024, 03:51 PM)
By the way, we've drifted too far off topic. Intel does provide an extended warranty for the affected CPUs.
Business-ethics-wise Intel isn't very good, and AMD is no better, but hopefully Intel won't go bankrupt.
We finally have Intel GPUs, and I can see some developers adding Intel-GPU-exclusive features to games, e.g. Cyberpunk 2077.

Can't let the AMD/nVidia duopoly dominate again, as without competition GPU prices will always go wild.
*
We did drift off topic, but the key thing is: Intel (and its mobo partners) screwed up with 13th/14th Gen, and AMD screwed up with X3D chips cooking themselves and the underwhelming 9000 series, so we really need a third CPU gamechanger, and so far ARM/SD is the best bet to push CPU evolution to the next stage. Unless someone else comes up with quantum computing that fits into a 2in x 2in square.

Oh, and then there's China, but let's not talk about them.

This post has been edited by babylon52281: Aug 9 2024, 05:49 PM
1024kbps
post Aug 9 2024, 09:20 PM

李素裳
*******
Senior Member
6,013 posts

Joined: Feb 2007



QUOTE(babylon52281 @ Aug 9 2024, 05:47 PM)
We did drift off topic, but the key thing is: Intel (and its mobo partners) screwed up with 13th/14th Gen, and AMD screwed up with X3D chips cooking themselves and the underwhelming 9000 series, so we really need a third CPU gamechanger, and so far ARM/SD is the best bet to push CPU evolution to the next stage. Unless someone else comes up with quantum computing that fits into a 2in x 2in square.

Oh, and then there's China, but let's not talk about them.
*
One is chasing raw performance, the other huge cache; then you have applications that don't really use the CPU's newer instructions, plus the resource-hogging Windows OS.

For ARM, Qualcomm and MS need to work out a binary translator for x86 apps on ARM CPUs, similar to Apple's Rosetta:
https://en.wikipedia.org/wiki/Rosetta_(software)

Well, if Apple can do it, MS and Qualcomm can too; hopefully it won't suck.
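What a translator like Rosetta does can be illustrated at toy scale. The sketch below is purely illustrative (real translators work on raw machine code, cache translated blocks, and bridge memory-model differences); the register mapping here is invented for the example:

```python
# Toy x86 -> ARM64 register mapping, invented for illustration only
REG_MAP = {"eax": "w0", "ebx": "w1", "ecx": "w2"}

def translate(insn: str) -> str:
    """Translate a tiny subset of x86 assembly text to ARM64-style text."""
    op, _, rest = insn.partition(" ")
    dst, src = [REG_MAP[r.strip()] for r in rest.split(",")]
    if op == "mov":
        return f"mov {dst}, {src}"
    if op == "add":
        # x86 'add dst, src' means dst += src; ARM64 uses a 3-operand form
        return f"add {dst}, {dst}, {src}"
    raise NotImplementedError(op)
```

The interesting part is the shape mismatch the translator has to absorb: x86's two-operand destructive `add` becomes ARM64's three-operand form, and that kind of impedance matching is everywhere in a real binary translator.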
XeactorZ
post Aug 10 2024, 05:36 PM

♥ PandaDog ♥
*********
All Stars
31,608 posts

Joined: Aug 2010
QUOTE(1024kbps @ Aug 9 2024, 03:51 PM)
By the way we shifted the topic too far away, Intel do provide extended warranty for the CPU,
business ethic wise Intel isnt very good, AMD is not better but hopefully it wont go bankrupt,
we finally have Intel GPU and i can see some developer added the Intel GPU exclusive function to games, eg Cyberpunk 2077.

Can't let the AMD/nVidia duopoly to dominate again as without competition the gpu price will always go wild
*
Meanwhile, waiting for MSI to release their BIOS update laugh.gif
1024kbps
post Aug 12 2024, 01:37 AM

李素裳
*******
Senior Member
6,013 posts

Joined: Feb 2007



QUOTE(XeactorZ @ Aug 10 2024, 05:36 PM)
Meanwhile, waiting for MSI to release their BIOS update laugh.gif
*
Performance impact measured after the microcode update:
https://www.phoronix.com/review/intel-raptor-lake-0x129

OOF size: large
babylon52281
post Aug 12 2024, 10:32 AM

Look at all my stars!!
*******
Senior Member
2,681 posts

Joined: Apr 2017
For 13th/14th Gen users, be aware there are TWO microcode updates to install:
0x125 and the latest 0x129.
You should install both in sequence.

The reason is that Intel has each update performing a different function on the voltage settings, though I am unsure whether the later one includes the earlier fixes, so to be safe flash a BIOS that carries both.
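On Linux you can confirm which microcode revision is actually loaded by reading `/proc/cpuinfo` (the `microcode` field is standard there). A small sketch, with a parsing helper made up for this example:

```python
def microcode_revision(cpuinfo_text: str):
    """Return the 'microcode' field from /proc/cpuinfo-style text, or None."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("microcode"):
            return line.split(":", 1)[1].strip()
    return None

# On a live Linux system:
# with open("/proc/cpuinfo") as f:
#     print(microcode_revision(f.read()))  # e.g. '0x129' after the update
```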
babylon52281
post Aug 14 2024, 04:16 PM

Look at all my stars!!
*******
Senior Member
2,681 posts

Joined: Apr 2017
The first review of the microcode is in, and at least this YouTuber says it doesn't fix the issue.


It seems VDD is still higher than spec, but I'm not sure VDD was the killer overvolt. Owners will still need to check for themselves whether the chip still degrades post-update; if it does, it's an epyc fail from Intel yet again (what's with these brands competing over who can fail harder, et tu AMD?).

My advice remains the same as when the issue blew up: don't trust them. Even with the update, manually set the limits to the values Intel specifies on their product pages:
https://www.intel.com/content/www/us/en/pro...ifications.html
https://www.intel.com/content/www/us/en/pro...ifications.html
https://www.intel.com/content/www/us/en/pro...ifications.html
https://www.intel.com/content/www/us/en/pro...ifications.html

No point trying to push it right to the limit; it's already at the limit.
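To check whether the limits actually took effect, Linux exposes the package power limits through the RAPL sysfs interface. A sketch under the assumption that the package domain is `intel-rapl:0` (paths vary between systems, so treat this as illustrative):

```python
from pathlib import Path

# Package power domain on many Intel Linux systems (path may differ)
RAPL = Path("/sys/class/powercap/intel-rapl:0")

def uw_to_w(microwatts: int) -> float:
    """RAPL reports limits in microwatts; convert to watts."""
    return microwatts / 1_000_000

def read_power_limits():
    """Read the long-term (PL1) and short-term (PL2) package limits."""
    pl1 = int((RAPL / "constraint_0_power_limit_uw").read_text())
    pl2 = int((RAPL / "constraint_1_power_limit_uw").read_text())
    return uw_to_w(pl1), uw_to_w(pl2)
```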
moiskyrie
post Aug 19 2024, 08:09 AM

Look at all my stars!!
*******
Senior Member
3,217 posts

Joined: Dec 2006
From: City of Neko~~Nyaa~
No wonder my office designer's new Lenovo desktop
(13th-gen i7) crashed and needed the mobo, CPU and GPU changed...
I think the CPU was changed twice.

Now I want to buy a new desktop.
Is the 14400 affected?
montaguespirit
post Aug 19 2024, 08:58 AM

Getting Started
**
Junior Member
81 posts

Joined: Apr 2015
So far no issues in my experience, because we don't overclock. Every processor generation has minor issues, but it takes time for Intel/AMD and the motherboard manufacturers to fix them.
babylon52281
post Aug 19 2024, 09:20 AM

Look at all my stars!!
*******
Senior Member
2,681 posts

Joined: Apr 2017
QUOTE(moiskyrie @ Aug 19 2024, 08:09 AM)
No wonder my office designer's new Lenovo desktop
(13th-gen i7) crashed and needed the mobo, CPU and GPU changed...
I think the CPU was changed twice.

Now I want to buy a new desktop.
Is the 14400 affected?
*
If you needed to change the mobo & GPU, you've got a different, serious problem. The CPU might be related, but I don't see how it could have killed your GPU...

Once you get the replacement, do update both microcodes (and any later ones), then set the TDP limit to Intel's PBP and set the VDD limit according to the video posted above. I think it should be stable after this, unless something else is discovered.
moiskyrie
post Aug 19 2024, 10:14 AM

Look at all my stars!!
*******
Senior Member
3,217 posts

Joined: Dec 2006
From: City of Neko~~Nyaa~
QUOTE(babylon52281 @ Aug 19 2024, 09:20 AM)
If you needed to change the mobo & GPU, you've got a different, serious problem. The CPU might be related, but I don't see how it could have killed your GPU...

Once you get the replacement, do update both microcodes (and any later ones), then set the TDP limit to Intel's PBP and set the VDD limit according to the video posted above. I think it should be stable after this, unless something else is discovered.
*
The Lenovo technician also doesn't know what happened, as the machine still BSODs after the hardware was changed a few times...
I think the first time they changed the CPU...
After that the GPU...
The third time the CPU and mobo...
The first CPU worked for a few days and then started BSODing again...
Even now it still BSODs sometimes...
babylon52281
post Aug 19 2024, 10:34 AM

Look at all my stars!!
*******
Senior Member
2,681 posts

Joined: Apr 2017
QUOTE(moiskyrie @ Aug 19 2024, 10:14 AM)
The Lenovo technician also doesn't know what happened, as the machine still BSODs after the hardware was changed a few times...
I think the first time they changed the CPU...
After that the GPU...
The third time the CPU and mobo...
The first CPU worked for a few days and then started BSODing again...
Even now it still BSODs sometimes...
*
Could it be something else: RAM or storage? Or the power supply?
montaguespirit
post Sep 17 2024, 10:42 AM

Getting Started
**
Junior Member
81 posts

Joined: Apr 2015
QUOTE(moiskyrie @ Aug 19 2024, 10:14 AM)
The Lenovo technician also doesn't know what happened, as the machine still BSODs after the hardware was changed a few times...
I think the first time they changed the CPU...
After that the GPU...
The third time the CPU and mobo...
The first CPU worked for a few days and then started BSODing again...
Even now it still BSODs sometimes...
*
Back in the Windows Vista days, I was working at Dell. We received a complaint from an end user (customer) with a similar issue. Basically, the onsite technician had changed every single part and it still had the BSOD issue. In the end, customer service decided to exchange a brand-new unit for the customer, which solved the problem. So basically nobody knew what happened there either.
Duckies
post Feb 9 2025, 02:51 PM

Rubber Ducky
*******
Senior Member
9,789 posts

Joined: Jun 2008
From: Rubber Duck Pond


Just to share: I finally switched the backplate to the Thermalright one. It helps by about 5-10°C! Previously I underclocked and undervolted, setting PL1 to 125W and PL2 to 180W before it would thermal-throttle, but now PL2 can go up to 200-220W! Not that it matters much in day-to-day use or gaming, but it's good to see the temps improve, and the backplate isn't expensive. I still couldn't get it to 253W though, as it thermal-throttles halfway. Room ambient is probably about 26°C since there's no aircon.

This post has been edited by Duckies: Feb 9 2025, 02:53 PM
