Thread: The CPU Thread
Say I'm gaming and my game is using 8 cores, and I want to multitask, so I'm also streaming in 4K. Maybe the E cores, perhaps with some ASIC help, can handle that, but I'm not sure.

Also, who knows if future games will use more cores. What if next-gen consoles come with 16 cores and most developers actually use them? What are the E cores gonna do?

In addition, I personally want to try some high-thread-count code. Uniform cores work wonders there: I can divide the work into equal chunks and have them run in parallel, finishing at about the same time. But what do I do with E cores? Either I give some threads a smaller chunk of the workload, or I just ignore the E cores.
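To illustrate what I mean, here's a minimal C++ sketch of that uniform-split pattern (the array-sum workload is made up for the example): with identical cores every chunk finishes at about the same time, but on a hybrid chip the chunks that land on E cores finish last and everything waits on them.

```cpp
// Hedged sketch (the workload is a made-up array sum): static uniform
// splitting that assumes every core runs its chunk at the same speed.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 1'000'000;
    std::vector<double> data(n, 1.0);
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(workers, 0.0);
    std::vector<std::thread> pool;
    const std::size_t chunk = n / workers;  // identical chunk per thread:
                                            // only fair if cores are identical
    for (unsigned t = 0; t < workers; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == workers) ? n : begin + chunk;
        pool.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& th : pool) th.join();
    // On a hybrid chip the chunks that landed on E cores finish last,
    // so the join above is gated by the slowest chunk.
    std::printf("sum = %f\n",
                std::accumulate(partial.begin(), partial.end(), 0.0));
    return 0;
}
```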

Note that some researchers have gotten AI code to run on CPUs competitively with GPUs, and such code can likely eat threads for breakfast. AI is the future, and it's conceivable CPUs still have room to play there.
I mean, a PS6 is at least half a decade away, so. But if a game used more than 8 cores, the E cores would at least get work once the 8 big cores are busy. They're perfectly capable of helping in gaming; they just shouldn't be used until the main cores are saturated. As for when more than 8 will be needed...? Well, it's anyone's guess, but for the vaaaast majority of games 6 cores are enough, and do you remember how long quad cores were enough? I think when more than 8 cores are beneficial for gaming, Intel will probably add more big cores, but not until then.

As for multicore/multitask workloads, I would say wait and see how the 13900K compares with the 7950X. But also, on the cheaper side of things, compare the i5-13600K with the 7600X... both have 6 performance cores, but the i5 has 8 E cores as well.

So y'know, on the lower-end SKUs Intel is just going to dominate for multitask stuff, and it'll be interesting to see the 13900K vs. the 7950X soon.
 
So y'know, on the lower-end SKUs Intel is just going to dominate for multitask stuff, and it'll be interesting to see the 13900K vs. the 7950X soon.
It'll multitask well for some tasks, but if you are multitasking with heavy applications like 4K streaming + high-end gaming, I think it'll experience abysmal performance.

Also, high-end games like Star Citizen are said to use over 100 threads. Look at some of the developers' comments:
Theoretically, this system scales to consume as many threads as are reasonably made available. There are always diminishing returns, but the idea is that heavy tasks can be more efficiently juggled across multiple cores.

What are you gonna do with E cores when it comes to these next-gen high-end multiplayer games?

On the Ryzen 7000 series, Star Citizen can run at up to ~200 fps in some areas, and over 60 fps in many areas.

On the Intel CPUs with E cores, the framerate is actually higher with the E cores disabled, yet even then it only reaches the 40s-50s. With the E cores enabled, the framerate is lower and there is microstutter.


Most gamers prefer multiplayer, and high-end multiplayer games like Star Citizen are the future for most gamers.
 
Doesn't look like much of an upgrade over my 3900X if I'm gaming at 4K. Might just skip both CPU and GPU upgrades this year.
 
It'll multitask well for some tasks, but if you are multitasking with heavy applications like 4K streaming + high-end gaming, I think it'll experience abysmal performance.

What are you gonna do with E cores when it comes to these next-gen high-end multiplayer games?

You can compare the 12600K vs. the 5600X right now and see that the Intel chip wipes the floor with the Ryzen 6-core in multitasking, and the 12600K only has 4 E cores. You get 14 total cores on the 13600K vs. 6 on the Ryzen 7600. There's nothing theoretical about it.

As for the 13900K vs. the 7950X, there's nothing I can say but wait.

I have absolutely nothing good to say about Star Citizen, and because it sounds like you may be a fan, I really don't want to get into that lol. But that's a software scheduling problem.
 


Intel's first 6.0 GHz CPU coming in 2023

With the disclosure of the next-gen Core series platform, Intel confirmed it will launch its first desktop CPU offering 6.0 GHz out of the box next year.

The new part, most likely called the Core i9-13900KS, will launch next year, and it will be limited in volume. The KS-series SKUs are pre-binned processors, offering the best power efficiency and stable operation at higher clocks. Earlier this year, Intel launched its Core i9-12900KS, the fastest SKU in the Alder Lake-S series, offering up to 5.5 GHz out of the box.

It is not clear if the 6.0 GHz claim is for all cores or just a certain number of them. However, it is undoubtedly only for the Performance (Raptor Cove) cores, not the remaining 16 Efficient cores.

The 12th Gen Core KS Special Edition launched at $739, a hefty $150 premium over the i9-12900K. It is unclear how much more expensive the 13900KS could be, but it definitely won't be cheap.

One should expect the arrival of the 6.0 GHz desktop CPU shortly after AMD launches its Ryzen 7000X3D series with 3D V-Cache. Rumors suggest that could happen at CES 2023.

[Image: Intel Core i9-13900KS 6 GHz hero banner]
 
I have absolutely nothing good to say about Star Citizen, and because it sounds like you may be a fan, I really don't want to get into that lol. But that's a software scheduling problem.
I don't know how it can be a scheduling problem when the game can use an unlimited number of threads; the only thing it could do is ignore the E cores. Which, again, means the 13600K will act as if it had no E cores compared to the 7600 in Star Citizen and games like it. The only option is for a high-end game to avoid the E cores like the plague, or suffer microstutter and lower framerates.
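For what it's worth, a game can already opt out of E cores per thread. Here's a hedged sketch using the Windows CPU Sets API, assuming (as on Alder Lake) that P cores report a higher EfficiencyClass than E cores; this is my illustration of the idea, not anything Star Citizen actually does:

```cpp
// Hedged sketch, not any shipping game's code: restricting a thread to
// P cores with the Windows CPU Sets API. Assumes the hybrid CPU reports
// a higher EfficiencyClass for P cores than for E cores.
#include <windows.h>
#include <cstdio>
#include <vector>

int main() {
    ULONG len = 0;
    GetSystemCpuSetInformation(nullptr, 0, &len, GetCurrentProcess(), 0);
    if (len == 0) return 1;
    std::vector<char> buf(len);
    auto* first = reinterpret_cast<SYSTEM_CPU_SET_INFORMATION*>(buf.data());
    if (!GetSystemCpuSetInformation(first, len, &len, GetCurrentProcess(), 0))
        return 1;

    // Pass 1: find the highest efficiency class present (the P cores).
    BYTE best = 0;
    for (ULONG off = 0; off < len;) {
        auto* e = reinterpret_cast<SYSTEM_CPU_SET_INFORMATION*>(buf.data() + off);
        if (e->CpuSet.EfficiencyClass > best) best = e->CpuSet.EfficiencyClass;
        off += e->Size;
    }
    // Pass 2: collect the CPU-set IDs belonging to that class.
    std::vector<ULONG> pCores;
    for (ULONG off = 0; off < len;) {
        auto* e = reinterpret_cast<SYSTEM_CPU_SET_INFORMATION*>(buf.data() + off);
        if (e->CpuSet.EfficiencyClass == best) pCores.push_back(e->CpuSet.Id);
        off += e->Size;
    }
    // Ask the scheduler to keep this thread on the P-core set only.
    SetThreadSelectedCpuSets(GetCurrentThread(), pCores.data(),
                             static_cast<ULONG>(pCores.size()));
    std::printf("restricted to %zu P-core CPU sets\n", pCores.size());
    return 0;
}
```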
 


Disabling E cores seems to increase performance in most current games, even in Windows 11, at least for systems with 4 P cores.
 


Disabling E cores seems to increase performance in most current games, even in Windows 11

It's been 10 months since the Alder Lake launch; I don't think that's still the case. Hardware Unboxed never mentions disabling E cores anymore. Seriously dood, it was the same with hyperthreading at first. Now every new desktop CPU not named Celeron has it.

But that's what's happening in Star Citizen too: the game is putting work on an E core instead of a P core; wrong scheduling.

And when they say it can use 100 threads, they mean the load meant for, say, 8 cores (I don't know if SC can fully use that many or not) can be spread evenly across 100 threads, i.e. instead of 8 cores pegged at 100% load you'll see 100 threads at 10% load. No game out there, AFAIK, actually needs more than 8 cores.
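To make that concrete, here's a minimal C++ sketch (the per-task work is a stand-in): 100 logical tasks drained by however many hardware threads exist. The thread count a game reports says nothing about how many cores it actually needs.

```cpp
// Hedged sketch (the per-task work is a stand-in): 100 logical tasks
// drained by a fixed pool of hardware threads. More tasks than cores
// just means lower per-thread load, not a need for more cores.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    constexpr int kTasks = 100;
    std::atomic<int> next{0};
    std::atomic<int> done{0};
    const unsigned workers = std::thread::hardware_concurrency(); // e.g. 8
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&] {
            // Each worker pulls task indices until the queue runs dry.
            for (int t; (t = next.fetch_add(1)) < kTasks;)
                done.fetch_add(1); // stand-in for real per-task work
        });
    }
    for (auto& th : pool) th.join();
    std::printf("%d tasks on %u hardware threads\n", done.load(), workers);
    return 0;
}
```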
 
This is from a recent video on the 12900K, posted 3 months ago.


Some games do worse on the 12900K even in Windows 11, even though it has 8 P cores; others do better. But look at this interesting comment on the video:
Something I noticed with E-cores off is either you got no frametime spikes where E-cores ON did, or they were smaller in size. E-cores off definitely helps with better frametimes. I guess it's the price you pay when games switch between P-cores and E-cores and vice versa. Personally I'd turn them off for a gaming rig coz frametime spikes piss me off.
 
This is from a recent video on the 12900K, posted 3 months ago.


Some games do worse on the 12900K even in Windows 11, even though it has 8 P cores; others do better. But look at this interesting comment on the video:

Only Horizon benefited at all from disabling E cores. The rest were either the same or better with them on. I was surprised to see Far Cry 6 do better with them on; funny, because I mentioned earlier that it was a problem game.

I certainly wouldn't disable them. Bang4buck says that as well at the end of the video, so this was a strange choice to try to prove your point.

And by the time I build a 14th-gen rig, I don't expect it to be an issue at all on Windows 11.
 
Something else to keep in mind is that more issues are going to appear in games with no fps cap. With a cap, if the scheduling is wrong for some reason and an E core is used, it might still get the job done and hit your fps limit. So just looking at how-fast-can-we-go benchmarks may not reflect your actual experience.
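To put rough, purely illustrative numbers on it: at a 60 fps cap the frame budget is about 16.7 ms. If a P core finishes a frame's critical work in 10 ms, an E core running at, say, 60% of that speed takes about 16.7 ms, which still just makes the cap. A capped game can therefore hide a mis-schedule that an uncapped benchmark would show as a lower maximum framerate.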

Frametimes could be more of an issue, but at some point E cores are going to be so strong that they could do great for older games where the scheduling isn't right. For example, 14th-gen E cores get a whole new architecture, whereas the P cores just get tweaked; 15th gen will see an overhaul of the P cores.

And for games moving forward do we really think Intel won't make sure these cores are properly supported?

I have a lot of my own money riding on Intel, so if I ever got a whiff of them fucking up I'd be here bitching about it. E cores aren't it.

New server chip delays? Yah I'm not happy.
 
And for games moving forward do we really think Intel won't make sure these cores are properly supported?

I have a lot of my own money riding on Intel, so if I ever got a whiff of them fucking up I'd be here bitching about it. E cores aren't it.
In the 4 P-core + 8 E-core vs. 6 P-core comparison, the 2 additional P cores outdid the performance of the 8 E cores.

A regular core can handle multiple background tasks at the same time; you don't need a dedicated weak core for a background task.

I hope the E cores' frametime issues can be solved, but I'm thinking you'd basically have to label them as exclusively for certain tasks and keep them away from threads requiring high performance. I still simply do not like them: they are slower in clock rate and have significantly less performance.

We will see how well they age in the future.
 
In the 4 P-core + 8 E-core vs. 6 P-core comparison, the 2 additional P cores outdid the performance of the 8 E cores.
Huh? 2 P cores are not better than 8 E cores.

Intel can use either 8 E cores or *2* P cores in the same amount of die space/energy. Intel gets more performance by adding E cores than by adding more P cores, hence why they don't add more than 8 P cores.
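Rough math using Intel's own Alder Lake claims (treat the figures as marketing, not gospel): one P core takes roughly the die area of four E cores, and one E core delivers somewhere around half a P core's multithreaded throughput. So 8 E cores cost about 2 P cores' worth of area but deliver about 4 P cores' worth of throughput, i.e. roughly double the performance per unit of silicon.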

[Image: Intel Alder Lake hybrid design, P-core vs. E-core performance chart]




I think we've covered the whole scheduling issue, and the video you linked shared my sentiment that it's not really something to worry about. But since we're talking about it:

On the note of E cores not helping gaming currently: I'm obviously no engineer, but I have been wondering if Intel could launch an 8-core gaming chip without E cores and instead use that die space for their own version of stacked L3 cache. Now THAT would be much better than E cores for a gaming-only chip.
 
Huh? 2 P cores are not better than 8 E cores.

Intel can use either 8 E cores or *2* P cores in the same amount of die space/energy. Intel gets more performance by adding E cores than by adding more P cores, hence why they don't add more than 8 P cores.

Well, I was basing my point on the Hardware Unboxed video, where 2 P cores outdid 8 E cores when it came to gaming performance. In low-core-count systems, disabling E cores gives higher performance, and adding 2 P cores also substantially increases performance.

For some applications the E cores might have an advantage, but I wouldn't trust Intel's figures to be accurate.

In that Puget comparison you can see that 2 P cores with 8 E cores only gives about a ~25% advantage over 4 P cores, and that's in cherry-picked Cinebench, about the best case it'll get. For almost any performance-demanding task that you parallelize, a strong core is preferable to multiple weaker cores; there are some ridiculously parallelizable apps which may be an exception, but the exception they are. In fact, if you could somehow increase single-thread performance to ridiculous numbers, a single core would be preferable to a multicore system.
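That last point is basically Amdahl's law: if a fraction p of the work parallelizes, the speedup on n cores is 1/((1 - p) + p/n). With an illustrative p = 0.9, 8 cores give about a 4.7x speedup and 100 cores only about 9.2x; even infinite cores cap out at 1/(1 - p) = 10x, whereas making the single core faster speeds up the serial part too.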
 



[Image: Intel Core i9-13900KS 6 GHz hero banner]
They have a new gen already? Damn, the year goes fast. I only got my 11700K PC 16 months ago. It's too fast for me.
 

Intel teased that its 13th Gen Core series can deliver much higher frequencies than its predecessors. Despite featuring the same process technology, Intel made considerable improvements to the design, which also gave a lot of headroom for higher clocks out of the box, and even for further tweaking.

During the event, overclocker Allen 'Splave' Golibersuch managed to overclock the flagship Core i9-13900K processor to 8.2 GHz under liquid nitrogen (LN2). This is the high-end 24-core processor from Raptor Lake series that is yet to be released later this month. This frequency was achieved with a single core, while other cores were kept at 5.7 to 6.3 GHz.

That frequency is much higher than the Alder Lake i9-12900K record of 7.6 GHz, and even higher than AMD's newest Ryzen 9 7950X, which managed to reach 7.2 GHz on one core.

Despite this achievement, neither AMD's nor Intel's CPUs pose a threat to the best CPU for overclocking, the AMD FX-8370, which still occupies first place in the official HWBOT ranking at 8.7 GHz.

Intel Core i9-13900K has not yet been released. This CPU launches on October 20th, and it will cost at least $589.

Source via Tom's Hardware:

[Image: Intel Raptor Lake 8.2 GHz overclock]


 

Intel teased that its 13th Gen Core series can deliver much higher frequencies than its predecessors. Despite featuring the same process technology, Intel made considerable improvements to the design, which also gave a lot of headroom for higher clocks out of the box, and even for further tweaking.



I wonder how Intel will do on power efficiency this generation. It seems 12th gen was just brute-forcing higher clocks with more power. I hope the newer chips try to be a bit more power efficient, at least the i5s anyway.
 
I mean... Zen 3 is so good, why would anyone buy Zen 4 right now, especially with the absurd DDR5 prices?
 
I'm in no hurry, but I plan on focusing my next build around the Ryzen 9 5950X



Still waiting for the PC landscape to settle, maybe next year.

Will likely turn my current rig into some sort of HTPC/emulation box.
 
I'm in no hurry, but I plan on focusing my next build around the Ryzen 9 5950X
Any reason you're going for the 5950X? You'll likely be able to get a 13600K for cheaper, and it outperforms it.

 
Any reason you're going for the 5950X? You'll likely be able to get a 13600K for cheaper, and it outperforms it.


Oh really? Wtf

Just ignorance, I guess lol

I remember picking this CPU years back when I did video editing and wanted a build that could handle a 4K workload.

Thanks, I'll do research
 
Oh really? Wtf

Just ignorance, I guess lol
Stuff gets stronger for cheaper every year my dude. Except for Nvidia I guess.
 
Apart from games, are there really any big improvements to be had from stronger CPUs? I use my PC for some 3D animation stuff, and while the CPU helps with processing, the GPU handles most of the grunt work.
 
Apart from games, are there really any big improvements to be had from stronger CPUs? I use my PC for some 3D animation stuff, and while the CPU helps with processing, the GPU handles most of the grunt work.
Encoding, rendering, compression/decompression, compiling.

We are getting to a point where hardware is outpacing games. However, the 4090 is CPU-limited below 4K. Previously you would generally only be CPU-limited at 1080p and below (aside from poorly written games, heavily modded games, etc.), but now, with a more powerful CPU to go with these GPUs, 1440p/240 isn't a pipe dream anymore.

But if you're just playing games and don't own a 4090, it doesn't really matter. Pretty much anything from a 10600K or Ryzen 5000 and up will be fine for 1440p+ in most games.

I expect all of this to change as we get UE5 games.
 
I'd still wait for the AMD 3D chips next year. My only other consideration is which CPU is best for emulation. Still, I can't see much benefit in upgrading from my 5800X when I game at 4K.
 
I'd still wait for the AMD 3D chips next year. My only other consideration is which CPU is best for emulation. Still, I can't see much benefit in upgrading from my 5800X when I game at 4K.
Emulation isn't too bad even with laptop CPUs these days. The laptop I got this year, with a 3050 and an 11400H I think, can play GameCube games pretty well. I haven't tried emulating any systems beyond that, but a laptop CPU actually being able to handle that kind of emulation at 60 fps is so cool to me.
 
Apart from games, are there really any big improvements to be had from stronger CPUs? I use my PC for some 3D animation stuff, and while the CPU helps with processing, the GPU handles most of the grunt work.
I believe Battlefield 2042 benefits from having more than four cores. I upgraded from a 4790K to a 12700K, and even with a GTX 1080 it was a night-and-day difference in terms of high and consistent framerates. Admittedly, it's not a killer app, but we may expect more games to perform similarly as the generation matures.
 