
Parsing a local .xml file

StArL0rd84
Posts: 424
Joined: February 8th, 2015, 10:07 pm
Location: EU, Denmark.

Parsing a local .xml file

Post by StArL0rd84 »

So I've made a YouTube skin that displays uploads from my subscriptions.
But it can only show uploads from 10 subscriptions, and those channel IDs are manually entered in.
But now I've found a way to download a .xml file which contains the list of URLs for the RSS feeds from ALL my subscriptions!
And as a plus, that .xml file updates, so the latest uploads always stick to the top of the list.

I've got no problem downloading the .xml file with WebParser, but trying to parse the .xml file is another story.
I found some documentation for parsing local files here: https://docs.rainmeter.net/tips/webparser-local-files/
I THINK I did everything right, but yeah, obviously not :D

It should give me something like: https://www.youtube.com/feeds/videos.xml?channel_id=UCxQbYGpbdrh-b2ND-AfIybg
But [mFeedGet1] is just blank for me.

Code: Select all

[mXMLDownload]
 Measure=Plugin
 Plugin=WebParser
 URL=https://www.youtube.com/subscription_manager?action_takeout=1
 UpdateRate=1800
 Download=1
 DownloadFile="subscription_manager.xml"

[mFeedGet1]
 Measure=Plugin
 Plugin=WebParser
 URL=file://#SKINSPATH##CURRENTCONFIG#\DownloadFile\subscription_manager.xml
 RegExp="(?siU)xmlUrl="(.*)"/>"
 UpdateRate=1800
 StringIndex=1
YtSubscribtionBoxTest_1.0.rmskin
([mWorkTime] = 1 ? #Work# : ([mEnergyLoss:%] >= 70% ? #Chillmode# : </>))
jsmorley
Developer
Posts: 22628
Joined: April 19th, 2009, 11:02 pm
Location: Fort Hunt, Virginia, USA

Re: Parsing a local .xml file

Post by jsmorley »

StArL0rd84 wrote:So I've made a YouTube skin that displays uploads from my subscriptions.
But it can only show uploads from 10 subscriptions, and those channel IDs are manually entered in.
But now I've found a way to download a .xml file which contains the list of URLs for the RSS feeds from ALL my subscriptions!
And as a plus, that .xml file updates, so the latest uploads always stick to the top of the list.

I've got no problem downloading the .xml file with WebParser, but trying to parse the .xml file is another story.
I found some documentation for parsing local files here: https://docs.rainmeter.net/tips/webparser-local-files/
I THINK I did everything right, but yeah, obviously not :D

It should give me something like: https://www.youtube.com/feeds/videos.xml?channel_id=UCxQbYGpbdrh-b2ND-AfIybg
But [mFeedGet1] is just blank for me.

The problem is that you have them both happening at once. Try:

Code: Select all

[mXMLDownload]
 Measure=Plugin
 Plugin=WebParser
 URL=https://www.youtube.com/subscription_manager?action_takeout=1
 UpdateRate=1800
 Download=1
 DownloadFile="subscription_manager.xml"
 FinishAction=[!CommandMeasure mFeedGet1 "Update"]

[mFeedGet1]
 Measure=Plugin
 Plugin=WebParser
 URL=file://#CURRENTPATH#DownloadFile\subscription_manager.xml
 RegExp="(?siU)xmlUrl="(.*)"/>"
 UpdateRate=1800
 StringIndex=1
 
Edit: While this still won't work, see the post below... It is important to understand that !Update does pretty much nothing on a WebParser measure. WebParser is not so much driven by the skin Update, or any UpdateDivider (don't use this!) of the measure, but by the UpdateRate on the WebParser measure. The way it works is that the measure will "go out" to the remote resource and get the data every Update x UpdateRate milliseconds, so if Update=1000, then the default UpdateRate of 600 would be every 10 minutes.

To override that, to get the measure to "go out right now", you must use !CommandMeasure with the "Update" parameter. Using !Update on it just gets you one skin update, normally one second, closer to UpdateRate being reached. To force it to "go out now" with !Update would mean hammering it with 599 !Update commands. Probably not what you want to do.

https://docs.rainmeter.net/manual/plugins/webparser/#CommandMeasureUpdate
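
A minimal sketch of how these pieces fit together (the measure name and feed URL here are just placeholders, not part of the skin above): with Update=1000 and UpdateRate=600, the measure only goes out every 10 minutes unless you poke it with !CommandMeasure.

Code: Select all

[Rainmeter]
Update=1000

[mExampleFeed]
Measure=Plugin
Plugin=WebParser
URL=https://www.example.com/feed.xml
; 1000 ms x 600 = 600,000 ms, so the measure goes out every 10 minutes
UpdateRate=600
RegExp="(?siU)<title>(.*)</title>"
StringIndex=1

; To force the measure to go out right now, for example from a mouse action on a meter:
; LeftMouseUpAction=[!CommandMeasure mExampleFeed "Update"]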
jsmorley
Developer
Posts: 22628
Joined: April 19th, 2009, 11:02 pm
Location: Fort Hunt, Virginia, USA

Re: Parsing a local .xml file

Post by jsmorley »

The other problem is that this won't work...

While you can get at that XML in your browser, that is because you are "signed in" to YouTube in your browser. WebParser can't be "signed in", and you get a generic, "Who are you? Sign in or go away" version of the feed in Rainmeter.

As you can't sign in to YouTube (or your Google account in general) with HTTP authentication like:

https://YourAccount:YourPassword@www.youtube.com/subscription_manager?action_takeout=1

as you can to get your GMail feed, this will never work.
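
For reference, that Gmail-style HTTP authentication looks something like this in a skin (account, password, and measure name are placeholders, and whether Google still accepts plain HTTP authentication depends on your account's security settings; storing a password in plain text in a skin is its own problem):

Code: Select all

[mGmailUnread]
Measure=Plugin
Plugin=WebParser
; HTTP authentication embedded in the URL, as described above
URL=https://YourAccount:YourPassword@mail.google.com/mail/feed/atom
UpdateRate=600
RegExp="(?siU)<fullcount>(.*)</fullcount>"
StringIndex=1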

YouTube has no interest in displaying to the entire world the list of all the pornography channels you subscribe to, and I'm sort of OK with that thinking.
StArL0rd84
Posts: 424
Joined: February 8th, 2015, 10:07 pm
Location: EU, Denmark.

Re: Parsing a local .xml file

Post by StArL0rd84 »

jsmorley wrote:The other problem is that this won't work...

While you can get at that XML in your browser, that is because you are "signed in" to YouTube in your browser. WebParser can't be "signed in", and you get a generic, "Who are you? Sign in or go away" version of the feed in Rainmeter.

As you can't sign in to YouTube (or your Google account in general) with HTTP authentication like:

https://YourAccount:YourPassword@www.youtube.com/subscription_manager?action_takeout=1

as you can to get your GMail feed, this will never work.
But... I'm just extracting a link from the .xml file to use elsewhere.
The file is local, so I don't need to log on for that.

Also, I tried logging out of YouTube and it still downloaded the right file.
I'm thinking big brother Google knows my IP.
Then when I try to download the file, they know it's my specific subscription box .xml to generate.
Just a theory ;P

I just need to know the right way to extract stuff from an .xml file.
([mWorkTime] = 1 ? #Work# : ([mEnergyLoss:%] >= 70% ? #Chillmode# : </>))
jsmorley
Developer
Posts: 22628
Joined: April 19th, 2009, 11:02 pm
Location: Fort Hunt, Virginia, USA

Re: Parsing a local .xml file

Post by jsmorley »

StArL0rd84 wrote:But... I'm just extracting a link from the .xml file to use elsewhere.
The file is local, so I don't need to log on for that.

Also, I tried logging out of YouTube and it still downloaded the right file.
I'm thinking big brother Google knows my IP.
Then when I try to download the file, they know it's my specific subscription box .xml to generate.
Just a theory ;P


Nope. The only way this would work is if you are signed into your Google Account with Internet Explorer. Not Chrome, not Edge, not Firefox, but old-school Microsoft Internet Explorer. If you have a "cookie" from a site in IE, then WebParser, which uses calls to the IE API to work, will in fact use that cookie and the site you connect to will see you as signed in. Of course if you ever get "logged out" in IE, if the cookie expires, or some cleanup tool you use clears IE cookies, then the skin would stop working until you go to IE and sign in again.

HTTP is "stateless", there is no persistent connection maintained, and every HTTP request to a server is a brand-new, entirely anonymous thing. No matter what kind of complicated "authentication" a site uses, at the end of the day, they all boil down to some flavor of you sending an HTTP request, the site asking your browser or device "Ok, do you have a session cookie for this site, so we can know who your are?", and your browser or device answering "sure, here you go.". WebParser can't understand the question, and can't provide the answer. IE can, but that is a bad way to write a skin.

It can't be IP. First, most of us have dynamic IP addresses provided by our ISP, and they change all the time, and second, your external IP address points to your ISP, not to "you". That is why people who want to track you down by IP need to get court orders for your ISP. When you download that copy of Game of Thrones with BitTorrent, HBO is going to go to your ISP and ask "who had that IP address on this date and time?" If your ISP is any good, they won't tell HBO that, but they are likely to send you an email saying "Hey, HBO is bugging us about you, stop downloading Game of Thrones!". Even then, your external IP only points to your cable modem or router, even from the view of your ISP. It's not "you" that downloaded Game of Thrones, but your little brother on his laptop. That's your story and you're sticking to it... But someone in your house did...

In any case, if we assume we lived in a world where YouTube could know that this HTTP request came specifically from StArL0rd84, in spite of the fact that they have no logical support for that assumption, parsing XML is no different from parsing anything else. It's all just "text" to WebParser. You just need to find the patterns you can search on, and capture the data you want. There really is no reason that I can see to "download" the XML and parse it as a local file. You are just downloading what WebParser already has.
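
As a rough sketch of that idea (assuming the subscription_manager XML were reachable at all, which the sign-in problem above makes unlikely), you would just point a parent measure at the URL and capture the feed URLs directly, with child measures picking out each StringIndex:

Code: Select all

[mSubscriptions]
Measure=Plugin
Plugin=WebParser
URL=https://www.youtube.com/subscription_manager?action_takeout=1
UpdateRate=1800
; Capture the first two xmlUrl values straight out of the XML text
RegExp="(?siU)xmlUrl="(.*)".*xmlUrl="(.*)""

[mFeedUrl1]
Measure=Plugin
Plugin=WebParser
URL=[mSubscriptions]
StringIndex=1

[mFeedUrl2]
Measure=Plugin
Plugin=WebParser
URL=[mSubscriptions]
StringIndex=2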
kyriakos876
Posts: 919
Joined: January 30th, 2017, 2:01 am
Location: Greece

Re: Parsing a local .xml file

Post by kyriakos876 »

jsmorley wrote:
In any case, if we assume we lived in a world where YouTube could know that this HTTP request came specifically from StArL0rd84, in spite of the fact that they have no logical support for that assumption, parsing XML is no different from parsing anything else. It's all just "text" to WebParser. You just need to find the patterns you can search on, and capture the data you want. There really is no reason that I can see to "download" the XML and parse it as a local file. You are just downloading what WebParser already has.
So, theoretically, we could parse sites like YouTube if WebParser could manage cookies and not only read them?
jsmorley
Developer
Posts: 22628
Joined: April 19th, 2009, 11:02 pm
Location: Fort Hunt, Virginia, USA

Re: Parsing a local .xml file

Post by jsmorley »

kyriakos876 wrote:So, theoretically, we could parse sites like YouTube if Web-parser could manage cookies and not only read them?
Theoretically, yes.