
Rainmeter running on integrated GPU instead of the dedicated one

Yincognito
Rainmeter Sage
Posts: 7164
Joined: February 27th, 2015, 2:38 pm
Location: Terra Yincognita

Rainmeter running on integrated GPU instead of the dedicated one

Post by Yincognito »

I'm not sure if this is a bug or just some system reading used the wrong way, but on my new HP Pavilion 15 laptop, which has 2 GPUs (an integrated AMD Radeon one and a dedicated 4 GB NVIDIA GeForce GTX 1650), it's the integrated card that is being used by Rainmeter in, say, a skin that runs an animation updated every 25 ms (notice how the former's core usage indicates it's being used, while the latter's zero core usage indicates the opposite):
GPU 0.jpg
GPU 1.jpg
This is despite, on the one hand, having set the power plan to a performance one and, on the other hand, specifically setting the dedicated GPU to be used by Rainmeter, both in Start / Settings / System / Display / Graphics Settings / Choose An App To Set Preference:
GPU Pref.jpg
and in the NVIDIA Control Panel (the AMD Radeon Software panel doesn't have a setting to exclude apps from using it):
GPU Nvidia.jpg
This is the relevant part of the Rainmeter log (I'm using Windows 10 version 21H1, build 19043, x64):

Code: Select all

DBUG (00:32:24.294) : * EnumDisplayDevices / EnumDisplaySettings API
DBUG (00:32:24.295) : \\.\DISPLAY1
DBUG (00:32:24.295) :   Name     : Generic PnP Monitor
DBUG (00:32:24.295) :   Adapter  : AMD Radeon(TM) Graphics
DBUG (00:32:24.296) :   Flags    : ACTIVE PRIMARY (0x00080005)
DBUG (00:32:24.296) :   Handle   : 0x0000000000010001
DBUG (00:32:24.296) :   ScrArea  : L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.296) :   WorkArea : L=0, T=0, R=1920, B=1050 (W=1920, H=1050)
DBUG (00:32:24.296) : \\.\DISPLAY2
DBUG (00:32:24.296) :   Adapter  : AMD Radeon(TM) Graphics
DBUG (00:32:24.297) :   Flags    : (0x00080000)
DBUG (00:32:24.297) : \\.\DISPLAY3
DBUG (00:32:24.297) :   Adapter  : NVIDIA GeForce GTX 1650
DBUG (00:32:24.297) :   Flags    : (0x00000000)
DBUG (00:32:24.297) : ------------------------------
DBUG (00:32:24.298) : * EnumDisplayMonitors API
DBUG (00:32:24.298) : \\.\DISPLAY1
DBUG (00:32:24.298) :   Flags    : PRIMARY (0x00000001)
DBUG (00:32:24.298) :   Handle   : 0x0000000000010001
DBUG (00:32:24.298) :   ScrArea  : L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.298) :   WorkArea : L=0, T=0, R=1920, B=1050 (W=1920, H=1050)
DBUG (00:32:24.299) : ------------------------------
DBUG (00:32:24.299) : * METHOD: EnumDisplayDevices + EnumDisplaySettings Mode
DBUG (00:32:24.299) : * MONITORS: Count=3, Primary=@1
DBUG (00:32:24.299) : @0: Virtual screen
DBUG (00:32:24.299) :   L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.299) : @1: \\.\DISPLAY1 (active), MonitorName: Generic PnP Monitor
DBUG (00:32:24.300) :   L=0, T=0, R=1920, B=1080 (W=1920, H=1080)
DBUG (00:32:24.300) : @2: \\.\DISPLAY2 (inactive), MonitorName: 
DBUG (00:32:24.300) : @3: \\.\DISPLAY3 (inactive), MonitorName: 
DBUG (00:32:24.300) : ------------------------------
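For reference, that part of the log is essentially what the standard EnumDisplayDevices / EnumDisplaySettings enumeration reports; a minimal standalone sketch of such a loop (illustrative only, not Rainmeter's actual code) would be:

Code: Select all

// Illustrative sketch (not Rainmeter's code): list display devices, the adapter
// each one belongs to, and its state flags, similar to the log section above.
#include <windows.h>
#include <cstdio>

int main()
{
    DISPLAY_DEVICEW dd = { sizeof(dd) };
    for (DWORD i = 0; EnumDisplayDevicesW(nullptr, i, &dd, 0); ++i)
    {
        wprintf(L"%ls\n", dd.DeviceName);                  // e.g. \\.\DISPLAY1
        wprintf(L"  Adapter : %ls\n", dd.DeviceString);    // e.g. AMD Radeon(TM) Graphics
        wprintf(L"  Flags   : %ls%ls(0x%08lX)\n",
                (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? L"ACTIVE " : L"",
                (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? L"PRIMARY " : L"",
                dd.StateFlags);

        DEVMODEW dm = {};
        dm.dmSize = sizeof(dm);
        if (EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS, &dm))
            wprintf(L"  ScrArea : W=%lu, H=%lu\n", dm.dmPelsWidth, dm.dmPelsHeight);
    }
    return 0;
}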
So, the integrated GPU is marked as "active" in the log and is thus the one used by Rainmeter, despite the dedicated GPU not only being set to be used in the relevant places, but also being far more capable than the integrated one. Most other software picks the appropriate GPU correctly, though as you can see, some programs need some "guidance" on that (including Chrome), and they do correctly use the indicated GPU after being instructed to. For some reason, Rainmeter doesn't seem to want / know / be able to do that. For normal skins I wouldn't mind, but for more "intensive" skins doing animations, it's beneficial to have a powerful GPU in charge of the visual rendering...

P.S. Setting Rainmeter to use Hardware Acceleration does seem to trigger the use of the dedicated GPU, but that's unfeasible due to the H.A. bug where parts of the image, or the entire image, disappear during the animation (that bug has been mentioned by others on the forum earlier, by the way; it happens when using a Bitmap meter, or an Image meter where the current frame is "cropped" from a bitmap-like image using the ImageCrop option, but it does not happen when using an image sequence, which leads me to believe it might be a memory problem).
Profiles: Rainmeter Profile, DeviantArt Profile | Suites: MYiniMeter | Skins: Earth
Brian
Developer
Posts: 2684
Joined: November 24th, 2011, 1:42 am
Location: Utah

Re: Rainmeter running on integrated GPU instead of the dedicated one

Post by Brian »

I came across something related to this recently, which got me thinking more about this issue. There is a DirectX option that would allow you to select a preferred adapter for rendering.
https://learn.microsoft.com/en-us/windows/win32/api/dxcore_interface/ne-dxcore_interface-dxcoreadapterpreference

The bad news... it's for later versions of Windows 10.
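
For reference, a minimal sketch of what using that option could look like (illustrative only, assuming the DXCore headers from a recent Windows 10 SDK; this is not Rainmeter code):

Code: Select all

// Illustrative sketch only: list the graphics adapters sorted so that the
// high-performance (dedicated) GPU comes first.
#include <dxcore.h>      // link with dxcore.lib, needs a recent Windows 10 build
#include <wrl/client.h>
#include <cstdint>
#include <cstdio>
#include <string>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXCoreAdapterFactory> factory;
    if (FAILED(DXCoreCreateAdapterFactory(IID_PPV_ARGS(&factory)))) return 1;

    // Only consider adapters capable of D3D11 graphics.
    const GUID attributes[] = { DXCORE_ADAPTER_ATTRIBUTE_D3D11_GRAPHICS };
    ComPtr<IDXCoreAdapterList> list;
    if (FAILED(factory->CreateAdapterList(1, attributes, IID_PPV_ARGS(&list)))) return 1;

    // The relevant part: sort by DXCoreAdapterPreference::HighPerformance,
    // so index 0 becomes the preferred (dedicated) adapter.
    const DXCoreAdapterPreference preference = DXCoreAdapterPreference::HighPerformance;
    list->Sort(1, &preference);

    for (uint32_t i = 0; i < list->GetAdapterCount(); ++i)
    {
        ComPtr<IDXCoreAdapter> adapter;
        if (FAILED(list->GetAdapter(i, IID_PPV_ARGS(&adapter)))) continue;

        size_t size = 0;
        adapter->GetPropertySize(DXCoreAdapterProperty::DriverDescription, &size);
        std::string description(size, '\0');
        adapter->GetProperty(DXCoreAdapterProperty::DriverDescription, size, description.data());
        printf("%u: %s\n", i, description.c_str());
    }
    return 0;
}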

So, searching around, I found some other ways to "prefer" the dedicated graphics adapter vs an integrated one. The problem is, I don't have a lot of time to test if this works.

Could you try this test build?
https://builds.rainmeter.net/test_builds/Rainmeter-4.5.17.3723-prerelease-GPU_Preference.exe

It should select the dedicated GPU by default assuming you have the appropriate settings enabled in Windows and/or nvidia control panel (or whatever AMD uses). You might have to mess around with different settings.

If you want to fine-tune the preference settings, you can add PreferAMD=0/1 and PreferNVIDIA=0/1 to the [Rainmeter] section of your Rainmeter.ini. Both default to 1.
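
For what it's worth, here is a rough sketch of how such a vendor preference could be implemented on top of plain DXGI (an illustration of the general idea only; not necessarily what the test build actually does):

Code: Select all

// Sketch of the general idea: pick a Direct3D adapter by PCI vendor ID
// (0x10DE = NVIDIA, 0x1002 = AMD), falling back to the first hardware adapter.
#include <dxgi.h>        // link with dxgi.lib
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<IDXGIAdapter1> PickAdapter(bool preferNvidia, bool preferAmd)
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return nullptr;

    ComPtr<IDXGIAdapter1> fallback;
    for (UINT i = 0; ; ++i)
    {
        ComPtr<IDXGIAdapter1> adapter;
        if (factory->EnumAdapters1(i, &adapter) == DXGI_ERROR_NOT_FOUND) break;

        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;   // skip the software/WARP adapter

        if ((preferNvidia && desc.VendorId == 0x10DE) ||
            (preferAmd    && desc.VendorId == 0x1002))
            return adapter;                  // preferred vendor found - use it

        if (!fallback) fallback = adapter;   // otherwise remember the first hardware adapter
    }
    return fallback;
}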

-Brian

PS-Sorry it took so long to look at.
Yincognito
Rainmeter Sage
Posts: 7164
Joined: February 27th, 2015, 2:38 pm
Location: Terra Yincognita

Re: Rainmeter running on integrated GPU instead of the dedicated one

Post by Yincognito »

Brian wrote: July 21st, 2023, 10:28 pm I came across something related to this recently, which got me thinking more about this issue. There is a DirectX option that would allow you to select a preferred adapter for rendering.
https://learn.microsoft.com/en-us/windows/win32/api/dxcore_interface/ne-dxcore_interface-dxcoreadapterpreference

The bad news... it's for later versions of Windows 10.

So, searching around, I found some other ways to "prefer" the dedicated graphics adapter vs an integrated one. The problem is, I don't have a lot of time to test if this works.

Could you try this test build?
https://builds.rainmeter.net/test_builds/Rainmeter-4.5.17.3723-prerelease-GPU_Preference.exe

It should select the dedicated GPU by default assuming you have the appropriate settings enabled in Windows and/or nvidia control panel (or whatever AMD uses). You might have to mess around with different settings.

If you want to fine-tune the preference settings, you can add PreferAMD=0/1 and PreferNVIDIA=0/1 to the [Rainmeter] section of your Rainmeter.ini. Both default to 1.

-Brian

PS-Sorry it took so long to look at.
Hmm... no change. I tested all possible combinations of the 2 relevant settings in the NVIDIA panel (Use global setting (Auto-select: Integrated) and High-performance NVIDIA processor) and the 2 relevant settings in Windows Graphics Settings (Let Windows decide and High Performance), as well as adding first PreferNVIDIA=1 and then PreferAMD=0 as well to Rainmeter.ini's [Rainmeter] section, but the NVIDIA GPU only goes above 0% when Hardware Acceleration is enabled in Rainmeter, or when some external software plugin like WebView is used (this despite MSEdge.exe being set to run on "Power Saving", aka the integrated AMD card, in Settings - albeit more than just the MSEdge executable is involved in WebView, which might explain the contradiction).

Of course, just like before, the behavior is the expected one when HA is on; it just isn't replicated when HA is off. Thinking about it in retrospect, and going by the usual understanding that it's either the CPU (HA off) or the GPU (HA on) taking on the graphics burden, it might just be that when HA is off only the CPU, and maybe the default integrated GPU driving the desktop, are involved, whereas when HA is on an actual choice between video cards (including the dedicated one) is possible. Not sure if this makes much sense, but I can't explain it any other way either.

For the record, when I posted this I used a 9 GB skin to test with (yeah, I know, but it was mostly tons of frame images along with some mosaic ones), while now I used a similar but smaller skin with code like this:

Code: Select all

[Variables]
Update=25
Frames=240
Cols=16
Rows=15
Speed=1
Size=414
Edge=1
Glow=0.0157
EC=0,0,0,255

[Rainmeter]
Update=#Update#
AccurateText=1
DynamicWindowSize=1
OnRefreshAction=[!SetWindowPosition "49.90%" "48.85%" "50.00%" "50.00%"]

---Measures---

; Advance the frame counter by Speed each update, wrapping around at Frames
[Frame]
Measure=Calc
Formula=((#Frames#+Frame+#Speed#)%#Frames#)

---Meters---

; Elliptical container that clips the image meter below to a circle with a soft dark edge
[Container]
Meter=Shape
Shape=Ellipse (#Size#/2),(#Size#/2),(#Size#/2-#Edge#),(#Size#/2-#Edge#) | Fill RadialGradient EllipseGradient | StrokeWidth 0 | Stroke Color #EC#
EllipseGradient=0,0 | 0,0,0,255 ; 0.0 | 0,0,0,255 ; (1-#Glow#/(1+#Glow#)) | 0,0,0,128 ; (1-#Glow#/(1+#Glow#)) | 0,0,0,0 ; 1.0
UpdateDivider=-1

; Crop the current frame's cell out of the mosaic sheet (Cols x Rows cells of Size x Size pixels)
[Earth]
Container=Container
Meter=Image
ImageName=#@#Blue Marble.jpg
ImageCrop=(Trunc([Frame]%#Cols#)*#Size#),(Trunc([Frame]/#Cols#)*#Size#),#Size#,#Size#
DynamicVariables=1
which uses a (414 x 16) x (414 x 15) pixel mosaic JPEG as its source - basically the image below enlarged 10 times horizontally and 10 times vertically (the ImageCrop formula picks the current frame's cell out of that mosaic, e.g. frame 17 falls in column 1, row 1, so the crop starts at 414,414):
Blue Marble - Small.jpg
The measurements in the screenshots were done via the MSI Afterburner plugin and are more or less confirmed by the GPU alias of UsageMonitor.

As a curiosity, when running on the integrated AMD card, the GPU Core is at 4% and the GPU alias in UsageMonitor is at around 2% with HA off, while the GPU Core is at 9% and the GPU alias is at around 5% with HA on. However, when running on the more powerful dedicated NVIDIA card, the GPU Core is at 17% and the GPU alias is at around 19% (HA on) - a greater usage than on the less powerful integrated card. Anyway, this was mentioned only as a fun fact; I gave up long ago trying to understand the mathematics displayed in Windows...

P.S. I didn't restart the computer when testing, but I don't think it would matter since this is supposed to work without any restart needed. Of course, I exited Rainmeter before changing the GPU settings in both places, then ran it again.
P.P.S. Don't worry about the time it took to answer this; the more important thing is that the report has been read. ;-)
Profiles: Rainmeter Profile, DeviantArt Profile | Suites: MYiniMeter | Skins: Earth
Brian
Developer
Posts: 2684
Joined: November 24th, 2011, 1:42 am
Location: Utah

Re: Rainmeter running on integrated GPU instead of the dedicated one

Post by Brian »

Thanks for testing!
Yincognito wrote: July 22nd, 2023, 1:55 am Of course, just like before, the behavior is the expected one when HA is on; it just isn't replicated when HA is off. Thinking about it in retrospect, and going by the usual understanding that it's either the CPU (HA off) or the GPU (HA on) taking on the graphics burden, it might just be that when HA is off only the CPU, and maybe the default integrated GPU driving the desktop, are involved, whereas when HA is on an actual choice between video cards (including the dedicated one) is possible. Not sure if this makes much sense, but I can't explain it any other way either.
Yeah, with HA off, Rainmeter should use mostly CPU, but I do wonder if the system offloads some things to the GPU... especially if the monitor is plugged directly into the dedicated card instead of the integrated one. I have no idea how a laptop with 2 adapters (both integrated and dedicated) would manage this.
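
To picture the HA on/off difference (an assumption on my part about how the option maps to Direct3D device creation, not something taken from Rainmeter's sources): with HA off the rendering device would be a software (WARP) one, so there is no video card to choose; only the hardware path takes an adapter at all. Roughly:

Code: Select all

// Assumption about how the HA option maps to device creation - not taken from
// Rainmeter's sources: only the hardware path receives an adapter at all, so a
// GPU preference can only have an effect when hardware acceleration is on.
#include <d3d11.h>       // link with d3d11.lib
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D11Device> CreateRenderDevice(bool hardwareAcceleration, IDXGIAdapter* preferredAdapter)
{
    ComPtr<ID3D11Device> device;
    if (hardwareAcceleration)
    {
        // HA on: render on a hardware adapter (the preferred one, if given).
        D3D11CreateDevice(preferredAdapter,
                          preferredAdapter ? D3D_DRIVER_TYPE_UNKNOWN : D3D_DRIVER_TYPE_HARDWARE,
                          nullptr, D3D11_CREATE_DEVICE_BGRA_SUPPORT,
                          nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);
    }
    else
    {
        // HA off: WARP renders on the CPU, so no video card is ever selected.
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_WARP,
                          nullptr, D3D11_CREATE_DEVICE_BGRA_SUPPORT,
                          nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);
    }
    return device;
}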
Yincognito wrote: July 22nd, 2023, 1:55 am P.S. I didn't restart the computer when testing, but I don't think it would matter since this is supposed to work without any restart needed. Of course, I exited Rainmeter before changing the GPU settings in both places, then ran it again.
Yeah, you shouldn't have to restart your machine at all. Exiting and restarting Rainmeter is all that is needed.

Well, I will try to revisit this when we drop support for Windows 7/8/8.1.

-Brian
Active Colors
Moderator
Posts: 1254
Joined: February 16th, 2012, 3:32 am
Location: Berlin, Germany

Re: Rainmeter running on integrated GPU instead of the dedicated one

Post by Active Colors »

Brian wrote: July 22nd, 2023, 8:07 am Well, I will try to revisit this when we drop support for Windows 7/8/8.1.
Can I suggest a gradual support drop instead of cutting it off with one machete swoosh? One idea is to make a separate Rainmeter beta “Windows 10/11 only” version while also maintaining the current version for a while.
Yincognito
Rainmeter Sage
Posts: 7164
Joined: February 27th, 2015, 2:38 pm
Location: Terra Yincognita

Re: Rainmeter running on integrated GPU instead of the dedicated one

Post by Yincognito »

Brian wrote: July 22nd, 2023, 8:07 am Thanks for testing!


Yeah, with HA off, Rainmeter should use mostly CPU, but I do wonder if the system offloads some things to the GPU... especially if the monitor is plugged directly into the dedicated card instead of the integrated one. I have no idea how a laptop with 2 adapters (both integrated and dedicated) would manage this.


Yeah, you shouldn't have to restart your machine at all. Exiting and restarting Rainmeter is all that is needed.

Well, I will try to revisit this when we drop support for Windows 7/8/8.1.

-Brian
No worries, all good. I have no idea about the internals of this either, since this is my first time having a dual video card system (I avoided them like the plague before, for pretty much the same reason, but the specs and the price were too good not to consider buying this laptop). Many times I get the feeling that Windows expects desktop apps to run on the integrated card and D3D, aka games and such, on the dedicated one, even though most of the time it respects my preferences in that regard.
Profiles: Rainmeter Profile, DeviantArt Profile | Suites: MYiniMeter | Skins: Earth