patthefatrat: Hi folks. I am building a new system and I already bought an Nvidia GTX 960 bundled with The Witcher 3, which should be able to run the game on high settings at a reasonable framerate. The system will be an AMD one. The question now is: should I go for an 8-core like the FX 8320 (8x 3.5 GHz) / FX 8350 (8x 4.0 GHz), or an older 4- or 6-core with more GHz like the FX 4350 (4x 4.2 GHz) or FX 6350 (6x 3.9 GHz)? I read that the game uses 4 cores at most, and more don't seem to deliver more FPS. But the game's box lists the FX 8350 as the recommended CPU. Since I have no understanding of CPU architecture, could someone knowledgeable please enlighten me?
Thanks a lot.
I have an AMD FX 8350 bought 2.5 years ago. With The Witcher 3 running alongside Firefox with 160 tabs open in the background, Chrome with about 50 tabs, the Steam and Galaxy clients, HexChat, Mozilla Thunderbird, a dozen systray applets, uTorrent downloading Linux ISO images, and Windows Resource Monitor on a second monitor, my total CPU usage while in-game tops out at around 15-20% at 2560x1600.
In short, the game doesn't use anywhere near as many CPU resources as the recommendation suggests. With identical video hardware you shouldn't even notice an in-game difference between the 8350 and the 8320.
Another thing I should mention is that most games only use 2-4 cores effectively, even on a system with more. Even if they spread threads across 8 cores just because they're there, they tend to use a far lower percentage of each core's full computing resources. I have yet to see a game that both uses 8 cores and actually maxes out more than one or two of them. In most games I've seen that can even use more than 2 cores, only one core is pegged at around 100% while the other cores the game touches idle along at 5-20% usage at most. The reason is that a lot of the work a game has to do is not inherently parallelizable, so getting 8-way parallelization out of a game engine to the point where it fully maxes out every core in the system is just unlikely, at least with today's games.
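To put a rough number on that, here's a quick back-of-the-envelope Amdahl's-law calculation. The 60% "parallel fraction" below is a made-up figure for illustration, not anything measured from The Witcher 3 or any other engine:

```cpp
#include <cstdio>

// Back-of-the-envelope Amdahl's law: if only a fraction 'p' of the per-frame
// work can run in parallel, the best-case speedup on 'n' cores is
// 1 / ((1 - p) + p / n). The 60% figure below is an assumption for
// illustration, not a measurement of any real game engine.
int main() {
    const double p = 0.60;  // assumed parallel fraction of the work
    for (int n : {1, 2, 4, 6, 8}) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        // Average per-core busy time = speedup / n of the run's wall time.
        std::printf("%d cores -> %.2fx speedup (avg core usage ~%.0f%%)\n",
                    n, speedup, 100.0 * speedup / n);
    }
    return 0;
}
```

Even with a fairly generous parallel fraction, the average per-core usage on 8 cores comes out low, which lines up with the low totals I see in Resource Monitor.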
What multiple cores do bring to the table is a practical guarantee of sorts that the game will have a few cores to itself if it needs them and can actually utilize them effectively, while the OS shoves all the other background tasks onto the remaining cores.
Having said that, you'd most likely find a 4-core processor running at 4 GHz to be faster in any given game than an 8-core processor running at 3.8 GHz or lower. In this case the GHz wins, because the extra cores either don't get utilized at all or only see "token usage": the OS will spread threads across them, but it doesn't buy you anything. If a game has, say, three threads each using at most 5% of a core, all three could run on one core quite happily at about 15% of that core, or the OS could push each onto its own core. Either way they run at the same speed and underutilize whatever core they're on. Spreading threads across more cores instead of keeping them together can even lower performance by causing cache thrashing, unless the application was written so that isn't an issue.
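If you want to see that in action, here's a rough Linux-only sketch (assumes g++ with -pthread; the 1 ms busy / 19 ms sleep pattern is an invented stand-in for a lightweight game thread, not how any real engine schedules work):

```cpp
// Three "game" threads that are each busy only ~5% of the time finish in
// about the same wall time whether they're all pinned to core 0 or left
// free to spread across cores. Linux/glibc only (pthread_setaffinity_np).
#include <pthread.h>
#include <sched.h>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

static void pin_to_core(std::thread& t, int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(t.native_handle(), sizeof(set), &set);
}

// Simulate a lightweight game thread: ~1 ms of work per 20 ms period.
static void light_worker() {
    using namespace std::chrono;
    for (int i = 0; i < 200; ++i) {
        auto busy_until = steady_clock::now() + milliseconds(1);
        while (steady_clock::now() < busy_until) { /* spin: pretend work */ }
        std::this_thread::sleep_for(milliseconds(19));
    }
}

static double run(bool pin_all_to_one_core) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> threads;
    for (int i = 0; i < 3; ++i) {
        threads.emplace_back(light_worker);
        if (pin_all_to_one_core) pin_to_core(threads.back(), 0);
    }
    for (auto& t : threads) t.join();
    auto elapsed = std::chrono::steady_clock::now() - start;
    return std::chrono::duration<double>(elapsed).count();
}

int main() {
    std::printf("all on core 0 : %.2f s\n", run(true));
    std::printf("spread freely : %.2f s\n", run(false));
    return 0;
}
```

Both runs should finish in roughly the same wall time, which is the point: for threads that light, it barely matters which cores they land on.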
In short though, for gaming up to 4 cores is more than adequate and CPU clock speed is king. More than 4 cores just gives you smoother overall system performance, with plenty of CPU resources left over for other apps. For The Witcher 3 specifically, on the CPUs you listed, the game barely even touches the CPU in my observation and the GPU ends up being the bottleneck. I've got an AMD Radeon HD 7850 and get decent enough results with it even though it's below spec, but it's definitely my limiting factor.
If you've only got so much money to go around, save a bit on the CPU if you have to and pour the cash into a better GPU; you'll get a higher frame rate and fancier graphics for it.
Hope this helps.