 intel thread, 2021 budget superpowah

babylon52281
post Oct 11 2024, 12:34 PM

Look at all my stars!!
*******
Senior Member
2,654 posts

Joined: Apr 2017
QUOTE(Skylinestar @ Oct 11 2024, 09:20 AM)
Available on October 24th

Intel has officially announced the release timeline for the Core Ultra 200 series. The processors are formally introduced today, October 10th, coinciding with the unveiling of Intel's new desktop platform featuring the LGA-1851 socket, which replaces the previous LGA-1700 series after three years of use. The market launch of the Core Ultra 200 series is planned for October 24th, two weeks from today, aligning with the release of media reviews, this website included.

https://www.guru3d.com/review/intel-announc...rrow-lake-cpus/
*
AMD has moved on to 16 big P-cores, with rumours that the 99x0X3D will get V-Cache on both CCDs, yet Intel still sits at 8 P-cores and only keeps increasing the E-core count. When will they understand that 16 or 24 or 32 or however many E-cores are useless for games and most apps? Intel has been stuck on 8 P-cores since Comet Lake, like 5 generations ago!

Moving to chiplets should make it easy to scale up to 10 or 12 P-cores, but Intel would rather keep that segment for overpriced Xeons.
babylon52281
post Oct 27 2024, 10:46 AM

As with any major change, always go for the 2nd-gen evolution. ARL, like Ryzen 1xxx, can be quite rough while Intel works out the weaknesses in its chiplet tech.
babylon52281
post Nov 4 2024, 11:57 AM

QUOTE(Maxieos @ Nov 4 2024, 08:35 AM)
Then up till now, which one should I get? Gen 13/14 have issues even with the microcode fix; gen 12 is outdated on the 1700 socket.
The new Core Ultra is all expensive, and needs a new motherboard and DDR5.

Which to choose? Will the 1851 socket last only 1 generation and get discarded?

Any info on Bartlett Lake P-cores?
*
If you're just gaming, the 9800X3D is coming to Malaysia within this month or the next.
babylon52281
post Nov 5 2024, 08:29 AM

QUOTE(Maxieos @ Nov 5 2024, 07:01 AM)
Over budget; 2.5k for the CPU alone is not possible.
Which gen 12 is worth getting? I wonder why 13th/14th gen never dropped in price even with the problems.

Any info on Bartlett Lake P-cores? Since it's for the 1700 socket.
*
No competition, so AMD knows how to reap the market. Same as people say about Intel.

The best 12th gen now is the 12900/12900K paired with a Z690/Z790 mobo. 8 E-cores are more than enough to run Windows & background tasks; going up to 16 E-cores is useless.

According to HUB, Bartlett Lake was cancelled, so LGA1700 is truly dead.

This post has been edited by babylon52281: Nov 5 2024, 08:31 AM
babylon52281
post Nov 6 2024, 11:38 AM

QUOTE(Maxieos @ Nov 6 2024, 07:06 AM)
Since when was Bartlett cancelled? If Bartlett Lake is also affected by the ring bus issue, maybe it was cancelled.
Sorry, but again, 2k for an old 12900 without a price drop is not good.

What do you think of the 12700? No OC, but I need the iGPU in case of issues.
*
The HUB hosts mentioned it in one of their talks. Can't remember which segment, but it's in there.

You asked which 12th gen to get, so of course the 12900/K. The reason it's high priced is the 13th/14th gen instability pushing prices up. The 12700 is Intel's best bang for buck right now, but since the platform is dead-ended, it's better to pay for the best. I'm still rocking a 12700F, and when it starts to lag I can still BCLK OC it.

This post has been edited by babylon52281: Nov 6 2024, 12:04 PM
babylon52281
post Nov 7 2024, 10:41 PM

QUOTE(Maxieos @ Nov 7 2024, 10:25 AM)
Found it; I searched through the transcript at 14:45. Maybe it really is just a rumour, but why has 13th/14th gen never dropped in price despite the instability issue?
The 12700 needs a good Z motherboard, which is nowhere to be found now except from overseas Amazon.

Want to ask, do you know whether the rumour that the Core Ultra 1851 socket is really 1 gen only, just for Arrow Lake, is true?
Also, is there any software to check whether a 13th/14th gen purchase has the instability issue?
*
Intel is in financial trouble; why would they want to drop prices? It's also not their way of doing business. AMD, on the other hand...

The 12700 is non-K, so no need for a Z mobo. Just a solid B mobo will do: MSI B760M-A, or if you want BCLK OC, then the Mortar Max WiFi, ASRock B760M PG Riptide, or PG Sonic.

Rumours? Yes & no, 50-50.

The first person to come out with software that can instantly tell you will make loads of money overnight. So far I don't know of anyone that lucky.
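There's no such instant detector, but the usual home-brew proxy is a soak test: hammer the CPU with a deterministic workload and flag any pass that produces a different answer. A minimal sketch below (my own illustration, not a real degradation tool — real testing also involves sustained boost clocks and specific workloads):

```python
import hashlib
import os
import time

# Rough CPU sanity soak: repeatedly hash the same buffer and flag any
# run that disagrees with the first result (silent corruption is one
# symptom reported on unstable chips). Illustrative only.

def soak_test(seconds: float = 10.0) -> bool:
    """Return True if every hashing pass agrees with the first one."""
    chunk = os.urandom(1 << 20)                      # 1 MiB of random data
    expected = hashlib.sha256(chunk).hexdigest()     # reference answer
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        if hashlib.sha256(chunk).hexdigest() != expected:
            return False                             # mismatch -> suspect CPU
    return True

print("stable" if soak_test(2.0) else "UNSTABLE")
```

On a healthy machine this always prints "stable"; the point is only that a degraded core can fail such a loop, not that passing proves anything.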
babylon52281
post Nov 9 2024, 05:08 PM

QUOTE(Maxieos @ Nov 8 2024, 08:58 PM)
Yeah, after TSMC charges Intel full price... but at least the issue should have some impact on gen 13/14 prices. Maybe only during US Black Friday.

So which AMD processor is equal to the 12700 for productivity? Though this is the Intel thread, Intel is no longer an option. A Core Ultra 245K costs above 1.3k https://www.tmt.my/collections?category%5B%5D=processor
Or just grab an AM5 board + processor?

Want to ask, do you know whether the rumour that the Core Ultra 1851 socket is really 1 gen only, just for Arrow Lake, is true?
You know Intel's way of doing business: 1-2 gens only.
*
Black Friday won't affect prices here unless you buy from Amazon SG.

Going Core "Ultraman" means a new mobo & DDR5, so a more expensive platform than LGA1700, which can still use DDR4.

As I said, Core Ultra is still too raw; get a 9800X3D/7800X3D if purely gaming.
babylon52281
post Nov 11 2024, 08:38 AM

QUOTE(Maxieos @ Nov 10 2024, 07:22 AM)
LGA 1700 = obsolete
LGA 1851 = only 1 gen or 1 refresh (new mobo & DDR5)

AM4 = obsolete
AM5 = new mobo & DDR5

This year is the worst for Intel; no old gen can be upgraded. 13th/14th have issues, and the new gen is expensive. Going 12th gen LGA 1700 = stuck forever with no upgrade.
Even the Ultra 225, I assume, will also cost 1k.

So what is the cheapest option?
*
Just because it's obsolete doesn't mean it's uncompetitive. LGA1700 might be ended, but a 12700K/12900K can still run nearly all games well, especially at 2K to 4K where the CPU doesn't really matter since you're GPU limited.

Core Ultra & AM5 are still too raw with CUDIMM development ongoing. Most early-gen mobos will struggle to hit the 9,000-10,000 MT/s speeds of future RAM, so trying to future-proof is pointless. Just get whichever is best for your money right now.

EDIT
To put it into context on a grand scale, look at the TPU review charts
https://www.techpowerup.com/review/amd-ryze...9800x3d/20.html

The supposed CPU king of gaming differs from the "obsolete" 12700K by only 10% (at 2K) and a mere 3% (at 4K). That's just 3% FPS between Intel's best bang-for-buck and AMD's heavily scalped, not-coming-soon king CPU, if you're already a high-end gamer playing at 4K. And this is with a 4090 used to bring out the CPU difference; with a slower GPU, the CPU gap would surely be much smaller.

So no need to obsess over which CPU gen is better, especially if gaming at 4K; just buy what you can afford. A step up in GPU segment will give a bigger boost than the CPU.
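To make those percentages concrete, here's a quick sketch. The baseline FPS numbers are hypothetical; only the 10%/3% gaps come from the TPU chart discussion above:

```python
# Relative FPS gap between two CPUs at a given resolution.
# Baseline FPS values below are made up for illustration; the
# 10% (1440p) / 3% (4K) figures are from the cited TPU chart.

def fps_gap_percent(fps_new: float, fps_old: float) -> float:
    """Percentage FPS advantage of fps_new over fps_old."""
    return (fps_new - fps_old) / fps_old * 100

# Hypothetical 4K numbers with a 4090: a ~3% gap means...
old_cpu_fps = 100.0   # e.g. a 12700K averaging 100 FPS
new_cpu_fps = 103.0   # the "king" CPU, 3% ahead at 4K

print(f"{fps_gap_percent(new_cpu_fps, old_cpu_fps):.1f}% faster")  # prints: 3.0% faster
```

In other words, at 4K the upgrade buys about 3 extra frames per 100 — which is why the GPU tier matters far more than the CPU generation here.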

This post has been edited by babylon52281: Nov 11 2024, 05:34 PM
babylon52281
post Nov 17 2024, 09:50 AM

QUOTE(Maxieos @ Nov 16 2024, 09:26 PM)
Which motherboard to get? One that can run PL1=PL2?
*
If for an i5 up to a non-K i7: MSI Pro B760M-A (must be the -A), ASRock B760M PG Riptide or PG Sonic; for Asus, the Prime -A.
babylon52281
post Dec 3 2024, 12:50 PM

A Core 2 Duo-style turnaround might never happen again; heck, even Sandy Bridge levels of success might never happen again. C2D was such a hit because the Pentium 4/D uarch was such a failure that it could barely compete with AMD64.

This is unlike Core 13th/14th gen, which can still put up a fight against the latest Ryzens. Core Ultra was Intel's Ryzen 1st-gen moment: recall that nearly everyone said Ryzen was a bad idea, that it was bad in games & apps and buggy in Windows. AMD is a lot smaller, so they could stick with it and with their CEO because the uarch had potential; see what it became today. Core Ultra won't turn around magically like C2D; it needs a few evolution cycles to reach full potential, like Ryzen (or Pokemon, haha). But since Intel is too big to fail, they had to ditch a CEO mid-transition.

Well, it's also the last big player still with a Caucasian CEO; now that will likely change, like at so many others. Or Jensen? Hahaha!
babylon52281
post Dec 4 2024, 09:49 AM

QUOTE(kingkingyyk @ Dec 4 2024, 08:26 AM)
The B570/B580 looks interesting, to teach Jensen a lesson for serving the under-specced 3050 / 4060.

Might get one if the reviews turn out good.
*
Intel Arc's issue isn't mainly hardware (it's actually quite good for its class); it's the software that's disappointing. Driver support lags, and Intel has a culture of responding slowly to issues, whereas NVIDIA leads by miles with next-day driver hotfixes. Intel needs to realise it now has to be a software company as much as the hardware company it was before.

QUOTE(overfloe @ Dec 4 2024, 08:42 AM)
Probably one of Jensen's & Dr. Lisa's relatives will step in

one big happy family to conquer the chip world
*
LOL, TSMC has been trying to court Jensen; maybe he'll move over and let his wife run Nvidia, and then his long-lost cousin can take over Intel. ARM is also in trouble with its biggest customer Qualcomm, so another Huang family member can come in there too? Hehe

The Huang semi-con-glomerate.
babylon52281
post Dec 4 2024, 10:18 AM

I'll wait for Intel to come out with 6th & 7th series Arc; then the market will confuse the B660/B760 GPUs with B660/B760 mobos. Haha
babylon52281
post Dec 4 2024, 12:10 PM

https://www.techspot.com/news/105810-intel-...570-gaming.html

Battlemage is out, but I'm still wrapping my head around Intel's nomenclature.

If I understand correctly, it would be:
B - Battlemage family/generation, i.e. like R7000 or RTX 5000
5 - tier segment, so this should be midrange, i.e. R x700/x800 or RTX xx60/xx70 (it seems there is a '7' tier for a higher segment?)
70/80 - I think upper/lower segment similar to x700/x800 naming? Or perhaps upper/lower cuts of the same GPU, i.e. non-Ti & Ti variants?
LE? - There might also be an LE variant to indicate an upper tier within the same segment, i.e. Ti/Super?
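If that reading of the scheme is right, the decoding could be sketched like this. The field meanings are the guesses above, not an official Intel spec:

```python
import re

# Decode an Arc Battlemage-style name like "B580" or "B570 LE".
# Field meanings follow the forum guesses above -- NOT an official spec.
FAMILIES = {"A": "Alchemist", "B": "Battlemage"}

def decode_arc_name(name: str) -> dict:
    m = re.fullmatch(r"([AB])(\d)(\d0)(?:\s+(LE))?", name.strip())
    if not m:
        raise ValueError(f"unrecognised Arc name: {name!r}")
    family, tier, sub, variant = m.groups()
    return {
        "family": FAMILIES[family],    # generation letter
        "tier": int(tier),             # segment, e.g. 5 = midrange
        "sub_tier": sub,               # 70/80 = lower/upper cut
        "variant": variant or "base",  # optional LE suffix
    }

print(decode_arc_name("B580"))
print(decode_arc_name("B570 LE"))
```

Just a toy to make the guessed naming fields explicit; if Intel's real scheme differs (e.g. the '7' tier), the regex and comments would need updating.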

This post has been edited by babylon52281: Dec 4 2024, 12:11 PM
babylon52281
post Dec 4 2024, 03:00 PM

QUOTE(kingkingyyk @ Dec 4 2024, 01:27 PM)
And gone is Arc Control. There will be a new UI replacing it, but who knows what lies in the backend.
Easier if they make an 8GB variant and sell it at $199. Instant KO to the 6600XT / 3050 6GB.
*
With all the hate 8GB GPUs get, it's no wonder Intel would prefer to sidestep that minefield. Also, 8GB is now below even the low end for smooth gameplay, so if Intel wants a better customer experience it does better going slightly up to 10GB of VRAM, while the full-price 12GB goes to the more expensive B580.

Intel can still cut down to 5/6GB cards later if they target the low end. But they really need a 4070-class fighter if they want to grow market share.
babylon52281
post Dec 4 2024, 07:00 PM

QUOTE(kingkingyyk @ Dec 4 2024, 03:19 PM)
There is nothing wrong with 8GB provided it is priced nicely. Who doesn't like it when you can get it for RM899?

NVIDIA/AMD have been feeding us leftovers in the $159 to $199 range. RTX 3050 6GB? 6500XT PCIe x4 without a media engine?
*
In the current era? Such a low-end GPU would game very poorly even at low settings. It's not the VRAM then; the GPU just doesn't have enough legs. Good for an HTPC, though, but I think Intel would rather avoid that segment if the focus of its dGPUs is gaming, so they must ensure even the baseline GPU can play somewhat smoothly.

Personally, I wouldn't consider the 3050/6500 to be gaming GPUs. Their specs are way too castrated to effectively be one, even if they have 'Gaming' in their names & RGB to boost FPS.
babylon52281
post Dec 7 2024, 12:55 PM

QUOTE(chocobo7779 @ Dec 7 2024, 08:20 AM)
Both GPUs are kind of irrelevant when you consider that the RX6600 exists and has been in the $199 range for quite some time

Knowing Nvidia's mindset, I'm pretty sure the 3050 will sell much more than the 6600, despite the latter being substantially faster than the former
*
The RX6600 is in the same trash tier as those two: 8GB, PCIe x8.

Not just Nvidia; both brands are pushing shit-tier GPUs no better than APU level to consumers and calling them gaming cards, which is an insult to those who don't have much money. It's like telling people, "if you're too poor to buy a 4070/7800 for proper gaming, please don't game." Which is why people go for consoles; alas, even Sony has been upping PS prices, also telling people "stop being poor if you want to game!"
babylon52281
post Dec 8 2024, 11:06 AM

QUOTE(kingkingyyk @ Dec 7 2024, 03:40 PM)
With much better RT in the B570... AMD will have a headache for a few months before the 8600.
Guess NVIDIA still has not used up its capacity at Samsung, so the 3050 is the solution to clear the remainder.
That's the idea, but to be fair the 3050 6GB is still better than the 8700G. If they can sell the 3050 6GB at $149, APUs are no longer attractive. 6/8GB VRAM is still fine provided the price is low enough to undercut an APU.
*
Nobody in the mid-tier will care that much about RT if it means killing performance to below 60 FPS. At that level, raster rate matters more, and AMD holds that advantage over Nvidia. Intel Arc will need to beat AMD on that.

If the 3050 carries on into the next generation, I see Nvidia trying to turn it into the next GT 730/710, which stayed "legendary" in sales for years as a base GPU meant just to drive a monitor. It was only discontinued because driver support ended.

APUs are getting better by the generation, while the 3050 gets worse by the iteration. I won't be surprised if the next 3050 is 64-bit or PCIe Gen 3 only.
babylon52281
post Dec 8 2024, 11:15 AM

QUOTE(chocobo7779 @ Dec 7 2024, 04:12 PM)
? PCIe 4.0 x8 is still fine even on 3.0, and 8GB is still plenty doable if you manage game settings well (I've gamed at 1440p with an 8GB card)

Meh. Considering that high-end graphics = expensive game development these days, which basically means "safe" game designs, I've long moved away from games with fancy graphics and play things that feel more game-like
I'd much prefer indies/semi-niche JRPGs over fancy Hollywood stuff anyway

Consoles? That's a good one if you want nothing but Hollywood movies with "interactivity" peppered in (Nintendo notwithstanding; that's why my PS4 is pretty much a dust collector at the moment)

Maybe I'm in the minority, but the gaming industry might be better off making "worse"-looking games if that translates to better games; then again, it would certainly push hardware manufacturers to bankruptcy and kill Moore's Law for good if the industry did just that
*
PCIe 4.0 x8 only looks OK because the GPU is too weak to really make use of x16. If Nvidia/AMD gave us a GPU with real value-for-money performance, that x8 bandwidth would throttle it.

Hollywood movies with "interactivity"? But that's what Gen Z gamers want. https://www.techspot.com/news/105808-people...games-than.html

Can't blame developers for giving the market what it wants, right? Very soon AAA games will become visual novels minus the ecchi bits, because people are getting lazier.
babylon52281
post Dec 12 2024, 07:59 AM

QUOTE(pot @ Dec 10 2024, 10:09 PM)
Hi, is a 12700F at around RM800 new worth buying? Building a new PC this month. Or should I go i7 14700F at around 1.1k? Thank you
*
Right now 12th gen is value for money and doesn't have the degradation issue. 13th & 14th gen, even non-K, seem to be affected, although Intel says it's fixed after the BIOS 0x129 microcode. Both have 8 P-cores; the only difference is the later gens have more useless E-cores.

If your use isn't thread-dependent and you can't feel the extra 500 MHz, then 12th gen is better with fewer problems. Just go with a mobo that can unleash the PL2 limit.
babylon52281
post Dec 13 2024, 10:54 AM

QUOTE(pot @ Dec 12 2024, 10:32 PM)
Just building a mainstream gaming PC with an RTX 4060. Too bad I can't get the 12th gen 12700F for 800. Might go 14400.
Can a 14th gen 6-core be on par with the 12th gen 12700 series? Thank you for the reply.
*
You gotta be fast if you want flash sales.

In some apps it will be the same or better comparing clock-for-clock, but on thread count the 12700 might have the edge, with 2 extra faster P-cores vs. more slower E-cores.

A 14th gen i5 will do fine in 95% of cases; don't worry, get it if there's a good deal.
