mak_kawa wrote: ↑September 7th, 2020, 9:42 am
Hi Yincognito
Oh really...?! I didn't know that process% depends on CPU frequency... shame.
Yeah, it's perfectly logical if you think about it: 5% of 100 is less than 5% of 1000 (just an example), and if you want to compare things you need to use (or convert to) the same reference specs, otherwise the comparison leads to wrong conclusions. There's a reason all benchmarks list the computer specs as well, so that folks can properly estimate things on their own (and different) configurations.
It's easy to see this if you force, say, a lower frequency in the Control Panel's Power Settings (assuming the BIOS allows changing the frequency). If I set my frequency to 50% of its max, the Rainmeter CPU usage will double accordingly - by the way, doing this lets you spot smaller variations in CPU usage that might not be obvious at a higher frequency.
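To illustrate the point above with a quick sketch: if a process needs a roughly fixed amount of CPU work per second, its usage share is that work divided by the clock frequency, so halving the clock doubles the percentage. The cycle counts and frequencies below are made-up numbers for illustration only:

```python
# Toy model: a process does a fixed amount of work (cycles) per second,
# so its CPU usage share depends on how fast the CPU's clock is.
def usage_percent(work_cycles_per_sec: float, cpu_freq_hz: float) -> float:
    """Percentage of the CPU consumed by a fixed per-second workload."""
    return 100.0 * work_cycles_per_sec / cpu_freq_hz

# Same workload, two clock speeds (hypothetical values):
full_speed = usage_percent(100_000_000, 4_000_000_000)  # 4 GHz
half_speed = usage_percent(100_000_000, 2_000_000_000)  # throttled to 50%

print(full_speed)  # 2.5
print(half_speed)  # 5.0 - same work, half the frequency, double the usage
```

Of course real processes don't do perfectly constant work per second, but this is the basic arithmetic behind the doubling you see when forcing the frequency down.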
Ah... now I know it's simple arithmetic, not mathematics...
So, when I upgrade my CPU to a new generation, e.g. the 2.9-4.3 GHz i5 10400, the CPU frequency is x1.8 - 1.2 that of my i7 2600, which means the CPU usage% decreases to x0.5 - 0.8... oh, not bad.
mak_kawa wrote: ↑September 7th, 2020, 10:51 am
Hi Yincognito
Ah... now I know it's simple arithmetic, not mathematics...
So, when I upgrade my CPU to a new generation, e.g. the 2.9-4.3 GHz i5 10400, the CPU frequency is x1.8 - 1.2 that of my i7 2600, which means the CPU usage% decreases to x0.5 - 0.8... oh, not bad.
Yep, so basically the better the CPU or the higher its frequency, the less CPU usage you'll have for the same process. This is of course influenced by the GPU specs as well, since the GPU sometimes takes some of the burden off the CPU, but in general terms, yes, this is true.
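The conversion discussed in this exchange can be sketched as a one-liner: scale the measured usage by the ratio of the old frequency to the new one. This assumes work scales purely with clock speed (ignoring IPC improvements between generations, turbo behavior, GPU offload, etc.), and the example frequencies are just illustrative numbers, not a claim about any specific measurement:

```python
def convert_usage(usage_pct: float, freq_old_ghz: float, freq_new_ghz: float) -> float:
    """Estimate usage % on a new clock, assuming work scales only with frequency."""
    return usage_pct * freq_old_ghz / freq_new_ghz

# Illustrative: 5% measured on a 3.4 GHz CPU, estimated on a 4.3 GHz one.
print(round(convert_usage(5.0, 3.4, 4.3), 2))  # ~3.95
```

In practice a newer CPU usually does more work per clock cycle too, so the real-world drop in usage tends to be even larger than this pure frequency ratio suggests.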