Undervolting, CPU Lite Load and Input Lag

TerminatorUK

New member
PRIVATE E-2
Joined
Dec 28, 2013
Messages
7
Hey folks,

I hope you are all well.

I'm really not sure where to start with this. I've been an overclocker for a number of years, but I'm new to undervolting, so I'm really after a starter-for-ten. In every overclocking scenario I can remember (other than the obvious case of pushing too far and crashing!), the result was a net performance improvement.

However, since upgrading from my 8700K, dipping my toes in with a 5800X3D, and then recently changing to a 13600K/MSI Z690 Gaming Edge WiFi DDR4, I've come to understand that these CPUs turn the traditional equation upside down. They generally come tuned for near-maximum performance out of the box, and thermal limits throttle the clocks down to keep temperatures in check.

As overclocking seems to offer very little improvement with these CPUs, going the other way, undervolting (the same performance for less power and heat), seems to be the way to go.

With some initial investigation I've been successful in bringing down the power consumption and heat, but I've found it comes at the expense of input lag (something I am very sensitive to, and a real #1 bugbear of mine).

The figures are too tempting to simply ignore, however. I was monitoring a match in Battlefield 2042 last night: with CPU Lite Load on 'Auto' (which I think was Mode 12), the 13600K was pulling around 150W. Changing to CPU Lite Load Mode 3 brought this down to approximately 95W.

Temperatures followed a similarly positive trend, dropping from around 80°C to the mid-60s. In synthetic/stress tests (e.g. Prime95), this was the difference between the CPU throttling to stay below 90°C and no throttling occurring at all.
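For anyone wondering why the saving is so large: dynamic CPU power scales roughly with the square of the voltage at a given clock, so even a modest Vcore reduction compounds quickly. A rough sketch of the maths in Python (the voltages here are illustrative guesses, not values I measured):

Code:
# Rough estimate of how CPU package power scales with core voltage.
# Dynamic power is approximately proportional to V^2 * f, with
# capacitance and frequency held equal. The voltages below are
# illustrative guesses, not values measured on my system.

def scaled_power(p_base_w: float, v_base: float, v_new: float) -> float:
    """Estimate dynamic power after a Vcore change at the same clocks."""
    return p_base_w * (v_new / v_base) ** 2

# e.g. Lite Load Mode 12 at ~1.25V pulling 150W, dropped to ~1.10V:
print(f"{scaled_power(150, 1.25, 1.10):.0f} W")  # ~116 W

Leakage (static) power also falls with voltage, which would help explain why the measured drop (150W to ~95W) is bigger than the V-squared term alone suggests.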

However, I cannot ignore the fact that in both cases, the 5800X3D (undervolted with a simple Vcore offset) and the 13600K (CPU Lite Load reduced to Mode 3), there was a tangible loss of 'smoothness', and mouse responsiveness felt more laboured and sluggish.

To ensure it wasn't placebo, I even went to the extreme of setting up a standard test in CS:GO (the waiting room of the FPS test level), a 1000fps camera, and a 'reactive' light on my Razer Viper mouse, counting the frame difference between the muzzle flash appearing on screen and the mouse light activating (not an ideal test, but it gives something to go off; generally the muzzle flash appears on screen before the mouse light activates).

My notes are a bit scrappy (and I'm missing the 13600K stock tests), but I've included them below for completeness.

The TL;DR version: the more aggressive the undervolt, the more input lag presents itself. In addition, the stronger the undervolt, the more inconsistent the results were (i.e. a much greater mix of the muzzle flash appearing either before or after the mouse light).

Finally, it's also worth mentioning (noting the lack of a 13600K stock result for now) that neither system has been quite as responsive, input-lag-wise, as my 8700K @ 5GHz was.

5800X3D Stock:

Muzzle flash appears...

- 1 frame before light
- 26 frames before light
- 16 frames before light
- 26 frames before light
- 16 frames before light
- 38 frames before light
- 8 frames before light
+ 6 frames AFTER light
- 10 frames before light
- 22 frames before light

Total = 157 frames before light
Average = 15.7 frames before light



-0.200V offset + Kombo Strike 3:

Muzzle flash appears....

- 35 frames before light
- 11 frames before light
+ 12 frames AFTER light
- 3 frames before light
- 9 frames before light
- 31 frames before light
- 34 frames before light
+ 15 frames AFTER light
- 18 frames before light
+ 18 frames AFTER light

Total = 96 frames before light
Average = 9.6 frames before light
15.7 frames (stock) - 9.6 = 6.1

TOTAL EXTRA LAG = +6.1ms (slower)

-0.100V offset + Kombo Strike 3:

Muzzle flash appears....

- 1 frame before light
- 5 frames before light
- 7 frames before light
+ 2 frames AFTER light
- 3 frames before light
- 4 frames before light
- 9 frames before light
- 5 frames before light
- 9 frames before light
- 2 frames before light

Total = 43 frames before light
Average = 4.3 frames before light
15.7 frames (stock) - 4.3 = 11.4

TOTAL EXTRA LAG = +11.4ms (slower)


8700K @ 5GHz reference tests:

- 6 frames before light
- 29 frames before light
- 36 frames before light
- 5 frames before light
- 13 frames before light
- 22 frames before light
- 42 frames before light
- 24 frames before light
- 26 frames before light
- 39 frames before light

Total = 242 frames before light
Average = 24.2 frames before light
15.7 (5800X3D) - 24.2 = -8.5 ms (faster)

TOTAL EXTRA LAG = -8.5ms (faster)


13600K - CPU Lite Load Mode 3:

+ 4 frames AFTER light
- 22 frames before light
+ 10 frames AFTER light
+ 3 frames AFTER light
- 35 frames before light
- 29 frames before light
- 10 frames before light
- 8 frames before light
- 34 frames before light
- 15 frames before light

Total = 136 frames before light
Average = 13.6 frames before light
15.7 (5800X3D) - 13.6 = +2.1ms (slower)

TOTAL EXTRA LAG = +2.1ms (slower)
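
Since the camera runs at 1000fps, one frame equals one millisecond, so the averages convert directly to milliseconds. To sanity-check my totals (and to put a number on the inconsistency I mentioned), here's a quick Python script over the samples above, counting the 'AFTER light' entries as negative:

Code:
# Each sample is "frames the muzzle flash leads the mouse light",
# filmed at 1000fps, so 1 frame = 1 ms. "AFTER light" = negative.
from statistics import mean, stdev

runs = {
    "5800X3D stock":            [1, 26, 16, 26, 16, 38, 8, -6, 10, 22],
    "-0.200V + Kombo Strike 3": [35, 11, -12, 3, 9, 31, 34, -15, 18, -18],
    "-0.100V + Kombo Strike 3": [1, 5, 7, -2, 3, 4, 9, 5, 9, 2],
    "8700K @ 5GHz":             [6, 29, 36, 5, 13, 22, 42, 24, 26, 39],
    "13600K Lite Load Mode 3":  [-4, 22, -10, -3, 35, 29, 10, 8, 34, 15],
}

baseline = mean(runs["5800X3D stock"])
for name, samples in runs.items():
    avg = mean(samples)
    # stdev quantifies the run-to-run inconsistency of each config
    print(f"{name:26s} avg={avg:5.1f} ms  "
          f"stdev={stdev(samples):4.1f}  "
          f"extra lag vs stock: {baseline - avg:+.1f} ms")

The last column reproduces the deltas quoted above.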
 

citay

Pro
SERGEANT
Joined
Oct 12, 2016
Messages
14,535
What's the rest of your hardware, and which BIOS version are you on?
 

TerminatorUK

New member
PRIVATE E-2
Joined
Dec 28, 2013
Messages
7
Hey - thanks very much for your reply and any assistance you can offer.

Current setup as follows:

CPU: Intel Core i5 13600k
Motherboard: MSI Z690 Gaming Edge WiFi DDR4 (BIOS 7D31v1A 2023-01-16)
RAM: G.Skill Trident Z B-die DDR4 32GB (4 x 8GB) 3600 C16-16-16-35 @ 1.35V
GPU: MSI GeForce RTX 3090 24GB Gaming X Trio (also MSI GeForce GTX 1080 Ti 11GB Gaming X)
PSU: Corsair HX1200 1200w

AMD setup was as follows:

CPU: AMD Ryzen 7 5800X3D
Motherboard: MSI MPG B550 Gaming Carbon (BIOS 7C90v1B 2022-08-12)
RAM: Corsair Vengeance RGB PRO DDR4 32 GB (4 x 8 GB) 3200 C16
GPU: MSI GeForce RTX 3090 24GB Gaming X Trio (also MSI GeForce GTX 1080 Ti 11GB Gaming X)
PSU: Corsair HX1200 1200w

Old intel setup was:

CPU: Intel Core i7 8700k @ 5ghz
Motherboard: MSI Z370 Gaming M5 (BIOS 7B58v1A 2020-06-11)
RAM: Corsair Vengeance RGB PRO DDR4 32 GB (4 x 8 GB) 3200 C16
GPU: MSI GeForce GTX 1080 Ti 11GB Gaming X
PSU: Corsair HX1200 1200w

Note: during the upgrades, I was also using an MSI GeForce GTX 1080 Ti Gaming X, which showed the same increase in input lag as I undervolted the CPU further.
 

citay

Pro
SERGEANT
Joined
Oct 12, 2016
Messages
14,535
OK, all NVIDIA. I have a 3050 and I had problems with high latency; check with LatencyMon: https://www.resplendence.com/latencymon

The problems are due to a faulty NVIDIA driver at the moment (which has existed for at least two driver versions):

[Attached image: nvidia.png]



This increased latency is not only evident in LatencyMon; it also caused two BSODs (related to the NVIDIA driver) and audio crackling issues when playing sound with foobar2000, for example. I took out my 3050 for a test and switched to integrated Intel graphics: the problems are gone and performance is higher! Except in games of course, performance is abysmal there now, haha. But I will keep it like this until they have fixed this in the NVIDIA drivers. Or I might get a 4060 in a couple of months, if it's any good (I already know the 4050 won't be good, only 6 GB of VRAM).
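
If anyone wants to check whether they're on one of the affected driver versions before digging through a LatencyMon trace, nvidia-smi can report the installed version. A minimal sketch in Python, assuming nvidia-smi is on the PATH (it normally is after a standard driver install):

Code:
# Print the installed NVIDIA driver version via nvidia-smi.
# Assumes nvidia-smi is on the PATH (standard with NVIDIA drivers).
import subprocess

version = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"NVIDIA driver version: {version}")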
 

P.D&n

Aim big from the start and roll out!
Joined
Nov 20, 2022
Messages
285
I already know the 4050 won't be good, only 6 GB of VRAM
To play at 4K@120fps and 8K@60fps you will need at least 16 GB of VRAM. Future games will add more AI, so the demand on processing cores will also increase, and they will start doing some of it in the cloud too, to lower the local demand.

showed the same increase in input lag as I undervolted the CPU further.
True, gaming is one of the most demanding workloads and needs a lot of power, so I don't think you need to tune with CPU Lite Load. Did you try a positive offset? There are many ways to tune an MSI board: you could go for high boost clocks, or for a higher base clock. You can even turn off the E-cores. Take a look at this, man. It's a tune aimed specifically at gaming, using a higher clock speed and more power.


And if you have watercooling, you still need to cool the 'SPDs' around the CPU. The more expensive your motherboard is, the more of them it has, and the more boost you get. Mine, same as yours, has 16 of those blocks with a total of 90 amps, which means 5 amps each at high speed. I think they will get hot too. I didn't think about it like that when I was building the system; I only added some fans for my setup and a future GPU. Now I'm starting to understand why. I think I will feel it with my manual tuning at 5.7 GHz and no power limits. Still, watercooling and a 13900K would be a better choice.
 

citay

Pro
SERGEANT
Joined
Oct 12, 2016
Messages
14,535
To play at 4K@120fps and 8K@60fps you will need at least 16 GB of VRAM. Future games will add more AI, so the demand on processing cores will also increase, and they will start doing some of it in the cloud too, to lower the local demand.
I want reasonable framerates at WQHD, and I don't play the most recent AAA games, mostly older stuff. For that, the 4060 might do. My monitor does WQHD at 75 Hz.


And if you have watercooling, you still need to cool the 'SPDs' around the CPU. The more expensive your motherboard is, the more of them it has, and the more boost you get. Mine, same as yours, has 16 of those blocks with a total of 90 amps, which means 5 amps each at high speed. I think they will get hot too.
This is a bit painful to read, sorry. Please don't explain technical things when you haven't understood them properly yet. You say "spd" when you mean the VRM MOSFETs or powerstages. You take their theoretical rating of 90A each (not combined!), then talk about "5amp each at high speed" (?)... what is this all supposed to mean? 5A x 16 MOSFETs at ~1V = 80W? And you don't take into account their efficiency rating, or that they're not all for VCore?

The 90A rating is for an individual MOSFET/DrMOS/powerstage when tested in isolation; it is the maximum rating. On a motherboard, because of the tightly packed VRM section, you can never draw anywhere close to that current from a single MOSFET; it would overheat way before that. Also, good powerstages have around high-80% to low-90% efficiency, so the switching losses are relatively low. With a 13th-gen Raptor Lake CPU (13600K/13700K/13900K), the only worry is the enormous power consumption and thus heat output of the CPU itself. The 13700K, and especially the 13900K, will run at the thermal limit with most coolers under fully multithreaded load, so VRM temperatures are the least of your worries when the CPU is already throttling hard.
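
To put some rough numbers on it (all of the values below are illustrative assumptions, not measurements of any specific board):

Code:
# Back-of-the-envelope VRM numbers for a Raptor Lake CPU at full load.
# All values are illustrative assumptions, not board measurements.

cpu_power_w = 180.0   # e.g. a 13600K under full multithreaded load
vcore = 1.25          # approximate core voltage under load
vcore_stages = 12     # stages feeding VCore (the rest serve other rails)
efficiency = 0.90     # plausible for decent powerstages

cpu_current = cpu_power_w / vcore        # ~144 A total
per_stage = cpu_current / vcore_stages   # ~12 A per stage
print(f"{per_stage:.0f} A per stage, against a 90 A isolated rating")

# Conversion losses are dissipated as heat in the VRM itself:
vrm_loss_w = cpu_power_w * (1 - efficiency) / efficiency
print(f"~{vrm_loss_w:.0f} W of heat spread across the whole VRM")  # ~20 W

So even at full load, each stage carries a small fraction of its isolated rating, and the VRM as a whole only has to shed a couple of dozen watts, while the CPU itself is dumping close to 200W into the cooler.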
 

P.D&n

Aim big from the start and roll out!
Joined
Nov 20, 2022
Messages
285
Thanks for explaining the VRMs, that's what I meant. I didn't know they were MOSFETs; that sounds more involved, but I think they do more than that. And when I say 'think', it means I am not sure. It was just a rough calculation. 90A... each? Wow, that's a lot! Probably why they are so expensive. From what I understand, they regulate the load and the output current. I know that disabling the iGPU improves the OC; I don't know if that is possible on this board while a GPU is inserted... I'm still on the iGPU.

I know it's not only about the P-cores. Disabling the E-cores doesn't help in some games, although they are not needed there, but in other games it does. I did learn something, though... the ring bus regulates its frequency based on the load. Taking out the E-cores speeds up the ring bus frequency by around 20%. Still, that doesn't speed up the overall process that much.

So yeah, cooling is very important, but not for every task. Different BIOS profiles, stored per game, could benefit you: for games that are highly demanding on the cores, you could just use CPU Lite Load with enough power and lots of cooling; for a game with low CPU demand, you probably want more boost! Still, they heat up fast. With manual tuning and no power limits, I have seen Windows start in half the time. You don't want to tune just for Windows, but it tells me there are improvements I could make when I need the speed!
 

citay

Pro
SERGEANT
Joined
Oct 12, 2016
Messages
14,535
Yes, the VRM of a motherboard is a considerable cost factor; I have explained this here before.
 

P.D&n

Aim big from the start and roll out!
Joined
Nov 20, 2022
Messages
285
The problems are due to a faulty NVIDIA driver at the moment (which has existed for at least two driver versions):
That problem has been going on for a long time. That's why my last card was AMD, but you need to spend more than $600. Still, you get fewer cores.
 