Thread: The CPU Thread
As an owner of both a 7700X and a 7900X, I assure you that you can't go wrong with either.

 

I've just been looking at 4K gaming benchmarks and the difference from my 7600X is like 1-3 fps... in either direction lol. I do no productivity work, so it's going to be X3D or none at all. Kind of blew my mind actually lol.
 
The more I look into this, the more I think I might not even need a new CPU. The 7600X matches or *just* exceeds the 7800X/7900X/7950X for 4K gaming (because all of its cores are on one CCD).

The only options for me are the 7800X3D or the 7900X3D (the latter because of its second CCD without V-Cache but with higher clocks).

If AMD are only claiming a 10-20% uplift on their slides, it doesn't really make sense to upgrade for hundreds. It's crazy to think about, but the 7600X was only ever meant to be a temporary stopgap until now. It never occurred to me that gaming on a 6-core CPU would make virtually no difference compared to the "higher end" models.

Obviously this refers to 4K gaming only, and higher-core-count CPUs make total sense for productivity workloads. 8 cores also make sense for those who play games while doing something else (such as streaming).
 
That's always been the case. It's not core count, it's total throughput: the strength of each core × the number of them. And most games can only address, or only need, a few threads.

But I guarantee you'll get an X3D lol.
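To put rough numbers on why extra cores stop mattering, here's a back-of-the-envelope Amdahl's law sketch (the 50% parallel fraction is purely an illustrative assumption, not a measured figure for any game):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of per-frame work that scales across cores (assumed)
# n = number of cores
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.5  # assume half of a game's frame work parallelizes (illustrative only)
for cores in (6, 8, 12, 16):
    print(f"{cores} cores: {speedup(p, cores):.2f}x over one core")

# 6 cores:  1.71x
# 16 cores: 1.88x
# Only ~10% apart despite 2.7x the core count; per-core speed shifts the
# whole curve, while core count hits diminishing returns almost immediately.
```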
 
Ok people. For around 600 Euro, what would be the best CPU/mobo/RAM combo? 32 GB of DDR5 RAM.

What would you recommend?
 
That's a tight budget for a DDR5 build. I think the only CPUs that would fit into it are the 12400/13100, and you probably don't want those. So I'd add a little and go with a Ryzen 7600 for the AM5 upgrade path. Then whatever cheap mobo and 6000 MHz RAM you can find.
 

Yeah, I'm just seeing that DDR5 mobos are more expensive than I thought. 5200 MHz 32 GB DDR5 can be bought for 135 Euro on Amazon, so it's not that expensive. CPUs like the 12600K are around 300 Euro. But the mobos cost as much as the CPU. Wouldn't want to invest more than 650 Euro tbh. Mhm.
 
Are you against buying DDR4 instead? That would save you quite a bit on both RAM and the motherboard.
 

How big a performance hit would that mean? I really have no idea how big the gains from DDR5 are.

Just checked the prices... yeah, I could get a mainboard and 32 GB of DDR4 for 200 bucks, leaving 400-450 for a CPU.
 
The DDR4 vs. DDR5 difference is minor in most games at the moment, although the benefits will likely increase with time. Since your budget is limited, I would err on the side of using the savings for a better CPU, especially if you intend to keep the same setup for 5+ years.
 

Just read a bunch of articles and comparisons. At 1080p, DDR5-6000 can be significantly faster than slow DDR4 RAM, especially in games like Spider-Man.

But going with fast DDR4, like 4800 MHz, reduces that to a couple of percent. At 1080p, mind you. At higher resolutions it's even less. So not worth it at all tbh.

I'll go for the best CPU under 400, then fast DDR4 RAM with a mobo that supports it.
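For a rough sense of why the on-paper gap looks bigger than the in-game one, here's a quick peak-bandwidth calculation (a sketch only; games are usually far more sensitive to latency and cache than to peak bandwidth):

```python
# Peak bandwidth (GB/s) = transfer rate (MT/s) * 8 bytes per transfer
#                         * channels / 1000
def peak_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000

for label, rate in [("DDR4-3200", 3200), ("DDR4-4800", 4800), ("DDR5-6000", 6000)]:
    print(f"{label}: {peak_bandwidth_gbs(rate):.1f} GB/s dual-channel")

# DDR4-3200: 51.2 GB/s, DDR4-4800: 76.8 GB/s, DDR5-6000: 96.0 GB/s.
# Nearly 2x on paper from DDR4-3200 to DDR5-6000, yet fps moves only a few
# percent, because frame time is rarely bound by raw memory bandwidth.
```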
 
Don't overthink it. I'm running a 10-year-old CPU and a 1660 Super and run everything just fine at 1080p, and happily run racing games on triple 1080p screens or in VR (Rift at 90 Hz). Tbh any old machine will do a perfectly decent job.
 
I think it would be a big mistake to invest in AM4 right now. The platform is end of life.

Check out these prices near you:

Asus ROG Strix E-E WiFi mobo (or E-F)
6000 MHz Kingston Fury Renegade RAM
7600X or 7600 (as good as the best CPUs for games)

Then continued support until at least 2026. The next upgrade will be a drop-in CPU swap.
 
Don't overthink it. I'm running a 10-year-old CPU and a 1660 Super and run everything just fine at 1080p, and happily run racing games on triple 1080p screens or in VR (Rift at 90 Hz). Tbh any old machine will do a perfectly decent job.

I am the resident raytracing whore. That's not how I roll lol. I have an 8700K and a 3080 and it's not enough for me.
 

Is it 1440p you game at? Because at that resolution a 4080 is not that far off a 4090 (-13-14%?... if that). Only at 4K do you really notice the 30%+ uplift.

I know you'd have more fun with a 4080. Do it, do ittttt. A 2x upgrade over your 3080. Imagine Returnal and all that ray tracing goodness.
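Side note on the percentages, since "-13%" and "+15%" describe the same gap depending on which card you treat as the baseline (trivial arithmetic with made-up round numbers, not benchmark data):

```python
# If B is x% slower than A, then A is x/(1-x) faster than B.
def faster_than(a_fps: float, b_fps: float) -> float:
    """How much faster a is than b, as a percentage."""
    return (a_fps / b_fps - 1) * 100

fps_4090, fps_4080, fps_3080 = 100.0, 87.0, 43.5  # illustrative numbers only
print(f"4080 vs 4090: {faster_than(fps_4080, fps_4090):+.1f}%")  # -13.0%
print(f"4090 vs 4080: {faster_than(fps_4090, fps_4080):+.1f}%")  # +14.9%
print(f"4080 vs 3080: {faster_than(fps_4080, fps_3080):+.1f}%")  # +100.0%, the "2x"
```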
 

Playing at 4K more and more often because of my OLED. The 4080 is way too expensive; the cheapest one here comes in at 1,350 Euro. I'm not doing that shit.
 
I wonder how much the RT hardware adds to the BOM cost. It would be interesting to have a high-performance card that can't do RT, if it were significantly cheaper. The die space could be used for other things as well.
 
Just read a bunch of articles and comparisons. At 1080p, DDR5-6000 can be significantly faster than slow DDR4 RAM, especially in games like Spider-Man.

But going with fast DDR4, like 4800 MHz, reduces that to a couple of percent. At 1080p, mind you. At higher resolutions it's even less. So not worth it at all tbh.

I'll go for the best CPU under 400, then fast DDR4 RAM with a mobo that supports it.
Don't know where you saw that, but DDR5 can help a ton; check Hardware Unboxed's Spider-Man vids.

IMO dude, please wait for Intel Meteor Lake; that way you can upgrade to Intel 15th gen (Jim Keller's Royal Core design) if you like.

Speaking from experience, even this 5800X3D hits a bottleneck in ray tracing with this 4070 Ti; we need even better CPUs than what we have now. Personally I will buy the Meteor Lake i7 K SKU later this year.
 

I really might just do that, even if it means waiting until Q4.
 
I wish I was as rich as you. I just don't get the hype with raytracing, I really don't.
As a graphics whore who always favors turning it on, I completely disagree. It's not overhyped when it's well implemented.

There are examples of very bad RT implementations where I agree: it's dumb and costs too much performance.

But when competent devs use it well, it's easily the setting that makes the biggest difference. For me, in games where multiple RT effects come together and RTGI is in there as well, it's transformative. While playing, it all seems believable and coherent. The lighting is especially important because no other rendering tech can do indirect lighting well.

Like in this comparison, the scenes go from wrongly lit and flat to realistic and grounded. Now take any other setting like texture quality or even resolution and compare them - the difference won't be that significant.


Or just take Fortnite with and without Lumen.



If this isn't significant and not enough of a difference, I guess we have very different concepts of reality :D

And that's why I need a new CPU and more RAM: my CPU struggles with some RT games. (And I also need a new GPU, but the new ones are too expensive for me.)
 

Video games are meant to be fun, interactive experiences. How much is raytracing improving that experience?

There's also the question of whether or not video games need to be more realistic. Some do, some don't, and even for those that do I'd argue that most of them pushing towards realism should prioritize the physics and AI side of the equation rather than visuals.

I can dream up a number of scenarios in games, both single and multiplayer, where real-time RT can become a factor that leads to memorable moments - I've got scenes from games like Goldeneye/Perfect Dark, Zelda and Metal Gear Solid 2 in mind, for starters - but from all I've seen it's not being used in that fashion. It's being used more like a fresh coat of paint, an added layer of visual fidelity with little to no contribution to the interactive experience.

Just so there's no misunderstanding, real-time raytracing is great and I want to see it continue to evolve. However, I also want to see it used sparingly and where it makes sense rather than becoming a cheap and ultimately meaningless marketing gimmick to win over reviewers and the easily-influenced masses.

I don't see its use in Fortnite, for example, as a net positive unless I'm looking at it from the perspective of Intel, AMD and Nvidia - it'll help them push a few extra casual gamers in the low budget or outdated hardware camp to seek upgrades with RT in mind.

My core problem with the video game industry and the market it serves today (ESG aside, because that affects the world at large) is the heavy slant towards graphics vs. gameplay; it's been tilting further in that direction over time. RT is adding to that problem, but when I start seeing some strong examples of it deciding the outcome of an online firefight or producing entertaining, dynamic single-player experiences, my opinion will turn the other way.
 
3700X paired with a 3080 FE here, both bought at release. So that's a mid-2019 CPU. I've got a Gigabyte X570 Elite mobo with it and it was trash for the first few months... I regretted going Ryzen a lot after my 2500K. Long boot times and issues.
But it's fine after many UEFI and AGESA updates.
Generally I was too early with the upgrade. The 3700X turned out just OK, the X570 was very annoying, and PCIe 4.0 NVMe drives were only just coming out at a high price, so I have "just" 3.0 drives.

Is it time to upgrade? I play at 4K, usually with DLSS Quality.
I wouldn't see any fps difference playing at this res, correct?!
 
Video games are meant to be fun, interactive experiences. How much is raytracing improving that experience?

There's also the question of whether or not video games need to be more realistic. Some do, some don't, and even for those that do I'd argue that most of them pushing towards realism should prioritize the physics and AI side of the equation rather than visuals.

I can dream up a number of scenarios in games, both single and multiplayer, where real-time RT can become a factor that leads to memorable moments - I've got scenes from games like Goldeneye/Perfect Dark, Zelda and Metal Gear Solid 2 in mind, for starters - but from all I've seen it's not being used in that fashion. It's being used more like a fresh coat of paint, an added layer of visual fidelity with little to no contribution to the interactive experience.

Just so there's no misunderstanding, real-time raytracing is great and I want to see it continue to evolve. However, I also want to see it used sparingly and where it makes sense rather than becoming a cheap and ultimately meaningless marketing gimmick to win over reviewers and the easily-influenced masses.

I don't see its use in Fortnite, for example, as a net positive unless I'm looking at it from the perspective of Intel, AMD and Nvidia - it'll help them push a few extra casual gamers in the low budget or outdated hardware camp to seek upgrades with RT in mind.

My core problem with the video game industry and the market it serves today (ESG aside, because that affects the world at large) is the heavy slant towards graphics vs. gameplay; it's been tilting further in that direction over time. RT is adding to that problem, but when I start seeing some strong examples of it deciding the outcome of an online firefight or producing entertaining, dynamic single-player experiences, my opinion will turn the other way.
Just to add to this...

One of the most successful games of the last few years is Minecraft. Nobody would ever say that it has amazing graphics, yet they decided to add RT to it. Yes, it makes a significant difference visually, but does it really add anything to the game? The answer is no. Better graphics are nice, but they generally don't make or break a game. If that were the case, many of the classics would be shitty games today, and they aren't. And if they were shitty in the past, RT is not going to make them better games today.

RT will only become truly important when its effects become part of the gameplay. And that is definitely possible to implement, enabling things that would otherwise be either extremely taxing or impossible. Something like Alan Wake could be a lot more interactive and dynamic, since RT would provide more accurate lighting.
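A toy example of what "RT as gameplay" could look like (purely hypothetical, not how any shipped game does it): cast a shadow ray from the player towards each light and let the stealth or awareness systems react to whether the player is actually lit, instead of relying on hand-placed trigger volumes.

```python
import math

Vec3 = tuple[float, float, float]

def is_lit(player: Vec3, light: Vec3, occluders: list[tuple[Vec3, float]]) -> bool:
    """Shadow-ray query: is the straight line from player to light blocked
    by any occluder sphere? (Hypothetical gameplay hook, not a real API.)"""
    diff = tuple(l - p for l, p in zip(light, player))
    dist = math.sqrt(sum(d * d for d in diff))
    direction = tuple(d / dist for d in diff)
    for center, radius in occluders:
        # Standard ray-sphere intersection with a normalized direction:
        # t^2 + 2*b*t + c = 0, where b = dot(oc, dir), c = |oc|^2 - r^2.
        oc = tuple(p - c for p, c in zip(player, center))
        b = sum(o * d for o, d in zip(oc, direction))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - c
        if disc < 0:
            continue              # ray misses this sphere entirely
        t = -b - math.sqrt(disc)  # nearest hit along the ray
        if 0 < t < dist:          # occluder sits between player and light
            return False
    return True

# A pillar (sphere of radius 1) directly between the player and the light:
print(is_lit((0, 0, 0), (10, 0, 0), [((5, 0, 0), 1.0)]))  # False -> in shadow
print(is_lit((0, 0, 0), (10, 0, 0), [((5, 5, 0), 1.0)]))  # True  -> lit
```

An AI director could run a handful of queries like this per frame, and RT hardware already maintains exactly the kind of acceleration structure that makes such queries cheap.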
 
I'm somewhere in the middle on the subject. If you've got a high-end card it's a nice feature; on anything less, it's probably the first thing you need to turn off to save fps.

Nothing beats OLED as the single biggest thing you can do to improve graphics IMO.
 
But going back on-topic... I can't help but think that right now is not a very good time to upgrade your system. GPU prices are wack and motherboard prices are even worse. DDR5 is not a good enough upgrade over DDR4 yet, and DDR4 is nearing its end. This is really a "meh" time in the PC space imo.
 
Ok, so I'm waiting until the end of the year to upgrade my CPU. But in the meantime, I'll at least upgrade my current 16 GB of 2800 MHz RAM to 32 GB at 3200 MHz. I'm doing more and more video editing and games are getting hungrier, so I think it makes some sense. Not that expensive, a hundred bucks, and I'll sell my old RAM as well.