So, I'm curious what is actually happening here when the GPU usage reading jumps way above 100%. It usually happens while a game is loading something in, so that's probably relevant, but sometimes it seems completely random. The highest I've seen was 2460%; the rest of the time it shows normal, expected percentages like 36.2%.
But clearly it's impossible to actually use 11+ times the capacity of a graphics card, right? So what exactly is the data the plugin is pulling that produces such ludicrous numbers?
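My only guess is that Alias=GPU is just shorthand for the Windows "GPU Engine" performance counters, something like the following, based on my reading of the UsageMonitor docs (so I may have the Category/Counter names wrong):

Code:
; My guess at what Alias=GPU expands to; the Category/Counter values are assumptions on my part.
[measureGPURaw]
Measure=Plugin
Plugin=UsageMonitor
Category=GPU Engine
Counter=Utilization Percentage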
Here's the full skin code, in case it's relevant.
Code:
[Rainmeter]
Update=100
Background=#@#Background.png
BackgroundMode=3
BackgroundMargins=0,34,0,14

[measureGPU]
Measure=Plugin
Plugin=UsageMonitor
Alias=GPU

[meterLabelGPU]
Meter=String
MeterStyle=styleLeftText
X=20
Y=65 ;OFFSET1
W=190
H=14
Text=GPU Usage

[meterValueGPU]
Meter=String
MeterStyle=styleRightText
X=210
Y=0r
W=190
H=14
FontColor=255,255,255,255
Text=[measureGPU:1]%
Percentual=1
DynamicVariables=1

[meterBarGPU]
Meter=Histogram
MeasureName=measureGPU
X=20
Y=77 ;OFFSET2
W=190
H=40
PrimaryColor=200,255,150,255
SolidColor=255,255,255,100
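If it turns out these big numbers are just "normal" for whatever counter this is, I figure I could at least clamp what the skin displays with a Calc measure. Rough, untested sketch; measureGPUClamped is just a name I made up, not something from the plugin:

Code:
; Untested idea, not part of the current skin: cap the reading at 100 with a
; Calc measure, then point the value/histogram meters at that instead.
[measureGPUClamped]
Measure=Calc
Formula=Min(measureGPU, 100)
MinValue=0
MaxValue=100

; then in the meters:
; [meterValueGPU]  ->  Text=[measureGPUClamped:1]%
; [meterBarGPU]    ->  MeasureName=measureGPUClamped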