Intel Core Ultra thoughts?
Hi,

I've had a pet project of building a general-purpose PC, mainly for non-gaming uses, e.g. hobby Unreal Engine development, that will remain useful for some time.

In my country (UK) the Intel Core Ultra 7 265K has been discounted to £285. I wondered if it's worth a try? I've budgeted £400-500 for a new motherboard and a decent amount of RAM.

I've looked at a few reviews on YouTube; some compare the 265K to the Ryzen 7 7700X, while others compare it with the Ryzen 9 9900X. From what I've gathered, the Ryzen 7 is comparable with the Intel CPU in gaming performance at similar UK pricing, while in productivity the 265K is comparable to the 9900X, which costs about £100 more.
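(For what it's worth, the value maths I'm doing in my head is just score per pound. A quick Python sketch of it below; the prices are the ones above, but the scores are placeholders, not real benchmark numbers.)

# Rough value comparison: multithreaded score per pound.
# Prices are from the listings above (265K at £285, 9900X about
# £100 more); the scores are PLACEHOLDERS, not real benchmarks.

def score_per_pound(score: float, price_gbp: float) -> float:
    return score / price_gbp

cpus = {
    "Core Ultra 7 265K": (1.00, 285.0),  # normalised placeholder score
    "Ryzen 9 9900X": (1.00, 385.0),      # assumed roughly equal, per reviews
}

for name, (score, price) in cpus.items():
    print(f"{name}: {score_per_pound(score, price) * 1000:.2f} points per £1000")

# If the multithreaded results really are comparable, the 265K works
# out to roughly 35% more performance per pound at these prices.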
C1REX 17 Jun @ 1:59pm 
Originally posted by carl:
In my country (UK) the Intel Core Ultra 7 265K has been discounted to £285. I wondered if it's worth a try? I've budgeted £400-500 for a new motherboard and a decent amount of RAM.
I don't own the CPU and have probably seen the same reviews, but I would say it's worth a try.
Especially now, with some lower prices and BIOS updates that seem to help with performance compared to the initial reviews about 7 months ago.
Why not just go with a Ryzen 7800X3D or 9800X3D? Especially if your primary aim is gaming.
Monk 17 Jun @ 4:35pm 
Originally posted by Bad 💀 Motha:
Why not just go with a Ryzen 7800X3D or 9800X3D? Especially if your primary aim is gaming.

Try re-reading the post; it's mainly for non-gaming use and Unreal Engine development.

Given how various programs prefer one over the other, I'd guess their programs of choice favour Intel.

If they don't, they should buy whichever the primary use of the system favours.
_I_ 17 Jun @ 4:46pm 
wait a few gens before moving back to intel for gaming
current ultra have e cores

u7 265k has 8 p cores + 12 e cores, none with ht

next gens will drop e cores and have better overall core performance
where it counts in games

edit:
i read it the way bad did

8 p cores and lots of e cores could be fine for dev or work like that
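fwiw most build tools just size their worker pool from the logical cpu count, so all 20 of those threads get used for compiles. rough python sketch of the usual approach (20 assumes the 265k's 8p + 12e with no ht):

import os
from concurrent.futures import ProcessPoolExecutor

def compile_unit(source_file: str) -> str:
    # stand-in for a cpu-bound, compile-like task
    return f"compiled {source_file}"

# on a 265k (8 p-cores + 12 e-cores, no ht), os.cpu_count() reports 20
workers = os.cpu_count() or 1

if __name__ == "__main__":
    sources = [f"module_{i}.cpp" for i in range(100)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(compile_unit, sources))
    print(f"{len(results)} units built with {workers} workers")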
Last edited by _I_; 17 Jun @ 4:48pm
Monk 17 Jun @ 4:54pm 
Intel's next chip, due out in Q3, could be great for gaming; I have a feeling Bartlett Lake-S is going to be a monster gaming CPU.

12 P-cores, no E-cores. I haven't used LM on my direct-die 14900KS because that will make it easier to upgrade if it turns out to be a beast.
If it's for multi-threaded productivity where you'll be using software that uses as many cores as it can get, then the 265K is a good value option.

But if you'll also be doing gaming and/or not using all of the cores, it loses its appeal. I don't know if Unreal Engine development uses as many cores as it can or not, so someone else will have to answer that.
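One rough way to reason about that last point: if only part of a workload actually scales across cores, Amdahl's law caps what the extra cores buy you. A quick illustration in Python (the parallel fractions here are made-up examples, not measurements):

# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that parallelises.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Illustrative values only: a heavily parallel job (batch compiles,
# light bakes) vs a mostly serial one (a game's main thread).
for p in (0.95, 0.50):
    s8 = amdahl_speedup(p, 8)    # only 8 cores useful
    s20 = amdahl_speedup(p, 20)  # all 20 cores useful
    print(f"p={p:.2f}: 8 cores -> {s8:.2f}x, 20 cores -> {s20:.2f}x")

# With p=0.50 the jump from 8 to 20 cores is barely noticeable,
# which is the "loses its appeal" case above.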
TSMC also has a factory in Arizona, USA, and just output 2nm silicon wafers for Apple, AMD and NVIDIA. This should help improve things from those brands as well. Intel is behind at this point.
Originally posted by Bad 💀 Motha:
Why not just go with a Ryzen 7800X3D or 9800X3D? Especially if your primary aim is gaming.

Did you even bother reading? OP said it's for non-gaming. Why do you always do this?
Originally posted by Monk:
Intel's next chip, due out in Q3, could be great for gaming; I have a feeling Bartlett Lake-S is going to be a monster gaming CPU.

12 P-cores, no E-cores. I haven't used LM on my direct-die 14900KS because that will make it easier to upgrade if it turns out to be a beast.

That's what people keep claiming: "Intel will magically be the best again this generation, you guys just wait!"

How about waiting until they actually become good again? Because as it stands, Intel has been falling farther behind each release cycle.
Monk 17 Jun @ 7:15pm 
Originally posted by The_Abortionator:
Originally posted by Monk:
Intel's next chip, due out in Q3, could be great for gaming; I have a feeling Bartlett Lake-S is going to be a monster gaming CPU.

12 P-cores, no E-cores. I haven't used LM on my direct-die 14900KS because that will make it easier to upgrade if it turns out to be a beast.

That's what people keep claiming: "Intel will magically be the best again this generation, you guys just wait!"

How about waiting until they actually become good again? Because as it stands, Intel has been falling farther behind each release cycle.

That is why I said 'could', not 'will' like AMD fans have done for two decades before finally being correct with the 9800X3D; still waiting on the GPU front, sadly.

On paper, you have to admit it does sound good, and this isn't mythical unseen tech. We are talking about known good cores, just more of them and hopefully faster, as a follow-on from the 14900KS; that is, in theory, a basis for a monster CPU.
Originally posted by Monk:
Originally posted by The_Abortionator:

That's what people keep claiming: "Intel will magically be the best again this generation, you guys just wait!"

How about waiting until they actually become good again? Because as it stands, Intel has been falling farther behind each release cycle.

That is why I said 'could', not 'will' like AMD fans have done for two decades before finally being correct with the 9800X3D; still waiting on the GPU front, sadly.

On paper, you have to admit it does sound good, and this isn't mythical unseen tech. We are talking about known good cores, just more of them and hopefully faster, as a follow-on from the 14900KS; that is, in theory, a basis for a monster CPU.


Saying 'could' doesn't give people a pass to follow it up with stupid, unrealistic nonsense.

AMD "could" release a card that only take 60w of power while being 4x the performance of a 5090 but that doesn't make it likely.

Also, your memory of AMD/Intel's history seems to be waaay off. AMD only really fell too far behind on the release of the first Core i series. Before that they were pretty competitive, and not long before that they were in the lead.

In fact, AMD's dark days only stretch from 2010/2011-ish to 2016, so six years of uselessness is a far cry from 20 years.

Not to mention they became competitive with the 3000 chips and started becoming the better choice with the 5000 chips, especially the 5800X3D, which came before the 7800X3D, both of which came out before the 9800X3D.

So you seem WAAAAAAYY out of touch with reality. Not sure what TikTok video made you think this way, but it's wrong.
Monk 17 Jun @ 10:17pm 
FX was a disaster, and I swapped from a Phenom II 965 BE to a 3570K around that time, so 13 years ago-ish. Before that they traded blows pretty well, so yeah, 'decades' might have been a bit of an overstatement.

3000 was bad, the 5800X3D was slower than a well set-up 9900K and beaten soundly by a 10900K, and the 7800X3D was getting there but still had far too many stutters; it was largely a toss-up as to which side had the higher max FPS, but Intel had the smoother experience.

The current 9000 series are the first ones with clearly better performance and a frequency of dips that isn't an embarrassment to suffer through, so I think I'm pretty correct on my timeline.

Beyond that, what part of what I said is unrealistic?

If they are releasing a monolithic 12-performance-core, no-efficiency-core chip based on Raptor Lake, it's not unrealistic that they would base it on the best one, the 14900KS. With extra development and no E-cores using up power and producing heat, it's not crazy to expect it to hit 6.2 to 6.4GHz. Pair that with 12 cores vs the 14900KS' 8 and it sounds like a realistic beast of a chip to me, if the memory controller can really handle 8000MHz or faster memory, which isn't inconceivable. That sounds fairly realistic and based in reality, unlike what half the Internet dreams up about what the next AMD chip or GPU will pull off.
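To put rough numbers on it, here's the naive cores x clock maths (it ignores IPC, memory bandwidth and power limits, and the 12-core / ~6.3GHz figures are the rumour, not anything confirmed):

# Naive multithreaded ceiling: cores x clock. Ignores IPC changes,
# memory bandwidth and power limits, so treat it as an upper bound.

def ceiling(cores: int, ghz: float) -> float:
    return cores * ghz

baseline = ceiling(8, 6.2)    # 14900KS: 8 P-cores, 6.2GHz max boost
rumoured = ceiling(12, 6.3)   # rumoured Bartlett Lake-S: 12 P-cores, ~6.3GHz

print(f"naive uplift: {rumoured / baseline - 1:.0%}")  # about +52%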

I never claimed it would be the best either and I do believe you should take rumours with a grain of salt, but nothing about this seems outlandish.

As always, I don't care who makes it; I just want the best chip for my use case. If it's a dud or doesn't show, no big deal, the 14900KS will do just fine.
Originally posted by Monk:
FX was a disaster, and I swapped from a Phenom II 965 BE to a 3570K around that time, so 13 years ago-ish. Before that they traded blows pretty well, so yeah, 'decades' might have been a bit of an overstatement.

3000 was bad, the 5800X3D was slower than a well set-up 9900K and beaten soundly by a 10900K, and the 7800X3D was getting there but still had far too many stutters; it was largely a toss-up as to which side had the higher max FPS, but Intel had the smoother experience.

The current 9000 series are the first ones with clearly better performance and a frequency of dips that isn't an embarrassment to suffer through, so I think I'm pretty correct on my timeline.

Beyond that, what part of what I said is unrealistic?

If they are releasing a monolithic 12-performance-core, no-efficiency-core chip based on Raptor Lake, it's not unrealistic that they would base it on the best one, the 14900KS. With extra development and no E-cores using up power and producing heat, it's not crazy to expect it to hit 6.2 to 6.4GHz. Pair that with 12 cores vs the 14900KS' 8 and it sounds like a realistic beast of a chip to me, if the memory controller can really handle 8000MHz or faster memory, which isn't inconceivable. That sounds fairly realistic and based in reality, unlike what half the Internet dreams up about what the next AMD chip or GPU will pull off.

I never claimed it would be the best either and I do believe you should take rumours with a grain of salt, but nothing about this seems outlandish.

As always, I don't care who makes it; I just want the best chip for my use case. If it's a dud or doesn't show, no big deal, the 14900KS will do just fine.


Bro, what? You just wrote a bunch of broken fanfiction and asked what was unrealistic?

The 3000 series wasn't bad, and in no way, shape, or form was a 5800X3D slower than a 9900K, PERIOD. In fact, the 5800X3D was such a beast that Intel tried to hide it in their own performance graphs with the release of the 13900K.

And no, the 7800X3D didn't have magic stutters; you really gotta stop watching TikTok.

Also, no, 14th gen is not going to be the base for new chips. Its design was "shove a bunch of watts in", and look how that turned out. Having 12 overheating cores wouldn't magically give them better performance.

It also wouldn't give them the up to 80% more gaming performance they'd need to compete.
Monk 17 Jun @ 10:44pm 
My two, admittedly good, 9900K chips, which were both delidded and direct-die, used to beat or match pretty much any 5800X3D numbers I saw. Though I'll admit my experience there probably did influence my perception, as most chips wouldn't run at 5.3GHz all-core with 4400 memory, so I'll accept that critique.

The 10900K and 13900K could both spank it, though, as is the annoying way with Intel, you have to tinker with them to get the most out of them, unlike AMD. That's a MASSIVE win for team red and why I usually suggest Ryzen for anyone who doesn't really enjoy tweaking.

I'd say not having stutters makes them better already, at least for my use case. But I never said it would be the fastest, and I'm not interested in that; FPS has long gone beyond what is even remotely needed (for me), and I'm after the smoothest experience.

I am intrigued as to where you are getting the idea that a stock, let alone a tuned, 14900KS is 80% slower, though. Not saying it's not in certain games, just not aware of which those would be.
_I_ 17 Jun @ 11:40pm 
the first gens of ryzen were not bad, just not great
close to intel's core performance at the time, but with more cores

and again, amd releasing ryzen made intel scramble to put more cores on cpus