
Rainmeter running on integrated GPU instead of the dedicated one

Yincognito
Rainmeter Sage
Posts: 4070
Joined: February 27th, 2015, 2:38 pm
Location: Terra Yincognita


Post by Yincognito »

I'm not sure if this is a bug or just some system reading used the wrong way, but on my new HP Pavilion 15 laptop, which has 2 GPUs (an integrated AMD Radeon Graphics one and a dedicated 4 GB NVIDIA GeForce GTX 1650), it's the integrated card that Rainmeter uses in, say, a skin running an animation updated every 25 ms (notice how the former's core usage indicates it's being used, while the latter's zero core usage indicates the opposite):
GPU 0.jpg
GPU 1.jpg
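For reference, the kind of skin I mean is nothing exotic; a stripped-down sketch of it (the frame sheet path and frame count below are just placeholders) would be something like:

Code:

[Rainmeter]
; update every 25 ms, i.e. advance the animation one step per update
Update=25

[MeasureFrame]
; cycles 0 to 29, one step per update
Measure=Calc
Formula=Counter % 30
MinValue=0
MaxValue=29

[MeterAnim]
; the Bitmap meter displays the frame matching the measure's value
Meter=Bitmap
MeasureName=MeasureFrame
BitmapImage=#@#Frames\Sheet.png
BitmapFrames=30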
This happens despite having set the power plan to a performance one and, more to the point, explicitly setting the dedicated GPU to be used by Rainmeter, both in Start / Settings / System / Display / Graphics Settings / Choose An App To Set Preference:
GPU Pref.jpg
and in the NVIDIA Control Panel (the AMD Radeon Software panel doesn't have a setting to exclude apps from using it):
GPU Nvidia.jpg
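For what it's worth, as far as I know that Graphics Settings page just stores the per-app preference in the registry under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, so the equivalent .reg form should be something like this (assuming the default install path; 2 stands for "High performance"):

Code:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Program Files\\Rainmeter\\Rainmeter.exe"="GpuPreference=2;"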
This is the relevant part of the Rainmeter log (I'm using Windows 10 version 21H1, build 19043, x64):

Code:

DBUG (00:32:24.294) : * EnumDisplayDevices / EnumDisplaySettings API
DBUG (00:32:24.295) : \\.\DISPLAY1
DBUG (00:32:24.295) :   Name     : Generic PnP Monitor
DBUG (00:32:24.295) :   Adapter  : AMD Radeon(TM) Graphics
DBUG (00:32:24.296) :   Flags    : ACTIVE PRIMARY (0x00080005)
DBUG (00:32:24.296) :   Handle   : 0x0000000000010001
DBUG (00:32:24.296) :   ScrArea  : L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.296) :   WorkArea : L=0, T=0, R=1920, B=1050 (W=1920, H=1050)
DBUG (00:32:24.296) : \\.\DISPLAY2
DBUG (00:32:24.296) :   Adapter  : AMD Radeon(TM) Graphics
DBUG (00:32:24.297) :   Flags    : (0x00080000)
DBUG (00:32:24.297) : \\.\DISPLAY3
DBUG (00:32:24.297) :   Adapter  : NVIDIA GeForce GTX 1650
DBUG (00:32:24.297) :   Flags    : (0x00000000)
DBUG (00:32:24.297) : ------------------------------
DBUG (00:32:24.298) : * EnumDisplayMonitors API
DBUG (00:32:24.298) : \\.\DISPLAY1
DBUG (00:32:24.298) :   Flags    : PRIMARY (0x00000001)
DBUG (00:32:24.298) :   Handle   : 0x0000000000010001
DBUG (00:32:24.298) :   ScrArea  : L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.298) :   WorkArea : L=0, T=0, R=1920, B=1050 (W=1920, H=1050)
DBUG (00:32:24.299) : ------------------------------
DBUG (00:32:24.299) : * METHOD: EnumDisplayDevices + EnumDisplaySettings Mode
DBUG (00:32:24.299) : * MONITORS: Count=3, Primary=@1
DBUG (00:32:24.299) : @0: Virtual screen
DBUG (00:32:24.299) :   L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.299) : @1: \\.\DISPLAY1 (active), MonitorName: Generic PnP Monitor
DBUG (00:32:24.300) :   L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.300) : @2: \\.\DISPLAY2 (inactive), MonitorName: 
DBUG (00:32:24.300) : @3: \\.\DISPLAY3 (inactive), MonitorName: 
DBUG (00:32:24.300) : ------------------------------
So, the integrated GPU is marked as "active" in the log and is thus the one used by Rainmeter, despite the dedicated GPU not only being set to be used in the relevant places, but also being far more capable than the integrated one. Most other software chooses the appropriate GPU correctly, and as you can see, the few programs that need some "guidance" on that (including Chrome) do use the indicated GPU once instructed to. For some reason, Rainmeter doesn't seem to want to / know how to / be able to do that. For normal skins I wouldn't mind this, but for more "intensive" skins doing animations, it's beneficial to have the powerful GPU in charge of the visual rendering...

P.S. Setting Rainmeter to use Hardware Acceleration does seem to trigger the usage of the dedicated GPU, but then that's unfeasible due to the H.A. bug where parts of the image, or the entire image, disappear during the animation (that bug has been mentioned by others on the forum earlier, by the way; it happens when using a Bitmap meter, or an Image meter where the current frame is "cropped" from a "bitmap-like" sheet using the ImageCrop option, but does not happen when using an image sequence, which leads me to believe it might be a memory problem).
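To make the distinction in the P.S. clearer, the two approaches look something like this (the measure, the file names and the frame size are just placeholders, and you'd obviously use only one of the meters at a time):

Code:

[MeasureFrame]
Measure=Calc
Formula=Counter % 30

[MeterCropped]
; "cropped" frame: all 30 frames sit side by side in one sheet and
; ImageCrop cuts out the current 256x256 one (this variant glitches with H.A. on)
Meter=Image
ImageName=#@#Frames\Sheet.png
ImageCrop=([MeasureFrame:] * 256),0,256,256
DynamicVariables=1

[MeterSequence]
; image sequence: each frame is its own file, Frame0.png to Frame29.png
; (this variant works fine with H.A. on)
Meter=Image
ImageName=#@#Frames\Frame[MeasureFrame].png
DynamicVariables=1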