Thanks for the update.

I guess it is a bit complex to check the version number of a patch against that of a non-patch installer, and it might not work in every case anyway. Personally, I'm not that bothered about leaving the patch entries there ... not unless my manifest grows to the point where removing them might be desirable to speed things up etc.

Who knows about that last point, as I am still mostly entering new manifest entries on the fly, and many of my games, probably most, are yet to be imported. I have a ton of updates to get, so that will likely add a large batch when I get around to it.
Timboli: Thanks for the update. [...]
Well, I have two separate backups.

I have my real backup on a distributed MinIO spanning 4 separate machines with ~80 TB of redundant disk space (which I don't leave on all the time because of the electricity usage), and I have my slim single-instance service MinIO with a single 12 TB disk that I use as a mirror, so I can grab games I want to install without having to download them from GOG (that one I leave on most of the time).

My 12 TB service instance was full, but by clearing the patch files for a couple of games, I freed 250 GB of space (I haven't done all the games I could yet, so I might be able to double that). I'll probably have to add another disk to that MinIO, but I'm postponing that as much as I can.
Post edited August 30, 2023 by Magnitus
Magnitus: Well, I have two separate backups.
Me, I have four backups in all, and I just keep buying new portable drives as the need arises.
I have them organized very specifically, and so it works well for me. I have them stored in various places around the house and even a copy up in the shed, at some distance from the house.

No doubt I could save quite a bit of space if I removed all unnecessary patches and even updates. I don't have the mindset to bother with that right now, and drives are relatively cheap. I also have three kids to pass things onto one day.

I could have used a NAS, but got turned off them years ago, after trying a 2-bay Synology for a couple of years. It still works, still has the original two drives in it, full & mirrored, but now I just treat it as one of my movie etc. backups, and I rarely turn it on. Unless you are going to run multiple NAS units in very different physical locations, they are just like having all your eggs in one basket to me.

But hey, each to their own. :)
New release.

v0.23.0:

- Added a resume command for storage repairs, in case one gets interrupted (e.g. by a power outage)
- Added a flag to use the storage manifest filter when doing a manifest apply (useful to keep patch and language filters for specific games)
- Added a command to list games missing from your manifest, mostly to double-check (e.g. games that have no files because of your language filters, like the German-only version of Wolfenstein II, or that are not yet released, like Resident Evil 2 & 3 at the time of this writing)
- Added a command to filter out certain languages for a specific game (based on a user's need in one of the issues)
- Added macOS arm64 to the release binaries
Post edited August 02, 2024 by Magnitus
Thanks for the update.

I've been meaning to contact you about the possibility of choosing which CDN we download from.

As you are no doubt aware, we are currently limited, with gogcli.exe and gogrepo.py and our browsers, to using the Fastly CDN.

From what I've read, Galaxy doesn't use Fastly, and neither does Lgogdownloader; in the parameters for Lgogdownloader there is a choice of about 5 CDNs, and you can set a priority order for them.

It would be great to have something like that with gogcli.exe, as Fastly seems deliberately limited, badly so for many of us.

While I still use gogcli.exe for my manifest needs etc., I've had to turn to using curl.exe plus either aria2c.exe or fdm.exe to do the downloading. I do that to get multiple download threads per file. Without them, I am limited to a maximum of 1.3 megabytes a second, often less. I never get more than that for a single thread now, whereas before that CDN issue/crash in the middle of last year, I was getting around 5 megabytes a second with a single thread.

My thinking is that we weren't using the Fastly CDN before that, that it is GOG's fallback CDN, and that it is now forced upon those of us who don't use Galaxy. It appears Lgogdownloader can get around that limitation.
Timboli: Thanks for the update. [...]
I found it (very nice codebase btw):
https://github.com/Sude-/lgogdownloader/blob/master/src/galaxyapi.cpp#L593
https://github.com/Sude-/lgogdownloader/blob/master/include/globalconstants.h#L112

Eventually, I'd be interested in exploring the Galaxy API to see what kind of optimisations I could make with it. I've been interested in it for a while, but I've just been quite busy with my life & career.

At this point, with a new(ish) baby, plus another project I got going on, realistically, it might be some time (measured at least in months, maybe years) before I finally tackle this.

At this point, the main roadblock with this issue for me would be dealing with the recaptcha and the login to get a valid Galaxy token. If I integrate that, I want it to be:
- Dependable
- Cross-platform (in this day and age, all client software should run at least on Linux, Windows and macOS; no excuses, enough with the OS fragmentation of desktop software)

If I go with a pure HTML-processing solution (so that users pass their username and password and don't deal with a GUI), it might not be as dependable as I'd like, thanks to the recaptcha. If I go with a UI integration, it might not be as cross-platform as I'd like. Maybe with Tauri; I know Rust at this point, but I'd still have to ramp up on the framework itself, and it would be another separate tool from the client just to get the token (though I'm hoping to eventually, optionally, embed the gogcli client in a Tauri app for a graphical experience).

I might ultimately combine an HTML-processing solution with the Galaxy API, and keep the cookie-based solution around (which I know is a pain in the neck to get, but at least it is dependable, and I like that a lot about it) as a backup for dependability. I'll see.
Post edited August 05, 2024 by Magnitus
Magnitus: I found it (very nice codebase btw): [...]
No worries, and thanks for considering it, and maybe we will get lucky.

Babies sure do wear you out.
I have my daughter and baby grandson living with us, meaning I often play the surrogate role of father as well as grandfather. He's kind of taken over the house, and I've had to use muscles etc. that I stopped using years ago, as well as deal with all the additional noise and demands. Not what I expected in my mid-60s.

I take it you can't just use the cookie approach without much additional work? I am happy enough using a cookie text file. In fact, I use a second, different one with curl.exe.

In any case, I can cope with what I am currently having to do. It would be nice, though, to simplify things and only have to rely on gogcli.exe. There are timing issues etc. related to using Free Download Manager 5 in the mix.
New release.

v0.24.0:

- Made handling of fake file label listings in GOG game files more robust by trimming out files of size 0 (GOG now returning 500 on some of those entries convinced me to make the adjustment)
- Adjusted for an API change on GOG's part, where they removed one redirection step to get a file's metadata
- Added support for cookie strings and the JSON cookie format exported from Firefox
BIG THANKS for the continuing improvements.
Hello Magnitus.

I am currently having a game name issue with the game Onde, when trying to generate a manifest entry for it.

It appears it isn't unique enough, and this is with or without quoting it.

And because one of the seven games found has a manifest issue (bad metadata), generating the manifest fails at that point, before even getting to the desired game.

The following is the error I am getting.

[manifest writer] Generating/Updating manifest for 7 games
[manifest writer] Got all info on game with id 1078350245. 6 games left to process
getDownloadFileInfo(downloadPath=/downloads/age_of_wonders_2_the_wizards_throne/8533) -> Error location header url does not have the expected path query parameter


It would be good if the manifest generation could continue despite any errors, especially if some kind of exact game-name comparison was implemented. My code normally just extracts the right portion of the return based on the game ID, but that cannot happen, of course, if the return is truncated before the desired game is reached.

The first unwanted game being returned, after which it fails, is Mondealy Demo.
Going by the error above, it would appear the second unwanted game, the one it actually fails on, is Age Of Wonders 2 The Wizards Throne.
Post edited September 23, 2024 by Timboli
Timboli: Hello Magnitus. [...]
I'll have to take a look at those games.

For continuing despite errors, I can see why some people would want that, though I do not want it for myself (and implementing both paths would take longer), because if a game file is bad, I want to be notified about it in no uncertain terms so that I can take action (usually, contact GOG support). I suppose I could add some kind of error output file, so that an automated tool running on top of mine could parse it and decide what action to take.

Once I've fixed the bad metadata for the extras, generating the manifest should go back to taking about 30 minutes, plus there is a resume command to resume halted manifest generation/update so you don't need to restart from scratch.

Otherwise, if you want to skip a particular file when resuming manifest generation (something I've done in the past when the bad file was a patch I did not care about), you can add the download url of the bad file to the urls to skip. I guess I could add a flag to the resume commands to do that automatically.
Post edited September 24, 2024 by Magnitus
When looking for something that could help solve my issue, I came across your game-details command last night, which I don't recall noticing before, let alone testing.

It occurred to me that I could use its result to start building the basis of a manifest entry for a game, using your path-info command to help flesh out missing details, and, where that fails, curl. I spent most of today developing that idea, and it works well in my testing so far. So I guess you could call that my work-around, or alternate generate-manifest method. It alas cannot get the checksum value for an installer file if bad metadata is returned for it, but it can get what I need for downloading the file (file name, file size). Such a file can then be checked using InnoExtract instead, which is a longer process, but needs must.

EDIT
And then things got awfully complex.
I've just bought Darkest Dungeon and all its DLCs, and the entries are all over the place with the game-details command, so I now need to wrestle with them, most likely for a few hours. It's all trial and error with my testing, until I get a better grasp of the possibilities and cater for each.

I am having bad metadata issues with too many games now; it often seems like every second game. And more often than not, it is just an Extras file or three causing the issue. Those can be worked around using curl.exe, because there is no checksum involved.
Post edited September 25, 2024 by Timboli
avatar
Timboli: And then things got awfully complex.
I've just bought Darkest Dungeon and all its DLCs, and the entries are all over the place with the game-details command, so I now need to wrestle with them, most likely for a few hours. It's all trial and error with my testing, until I get a better grasp of the possibilities and cater for each.
Well, I had a crack at it, and improved my code and approach along the way, and hopefully it will now cater for all other variations.

Three variations tested so far.
1. Plain game (with patches).
2. Game plus soundtrack extra DLC.
3. Game plus DLC plus Extras ... the extras also included a Soundtrack DLC.

I'm still pondering whether to add in-program support to my GUI for InnoExtract testing where a checksum is missing, along with 7-Zip for other OS files, or to just use my GOGPlus Download Checker program independently, or support it via the command line.
The list of bad metadata games keeps growing.

Wasteland 3, for which I had just bought the Deluxe Edition Upgrade. Failed with an Extra.

Contraption Maker for which I had just bought the Wonderstructs DLC. Failed with that DLC.

Shadow Tactics - Blades of the Shogun - Aiko's Choice, for which I had just bought the Soundtrack DLC. That failed with the checksum of the main Linux download. I think Linux support may have been recently added for the game.

And those are just the first three of the many I need to get right now. So not an encouraging ratio so far, with 3 out of 3 failing.

On the bright side, my new alternate generate-manifest code is still working well ... successful for 6 games now. Only one file had a missing checksum, which will require using 7-Zip to check the SH archive.

EDIT

Not all bad. Faraday Protocol (just bought), passed.

That said, A Night at the Watermill (just bought) failed and crashed my GUI with the alternate code. It was an Extra that failed before the crash. I'll have to investigate the crash tomorrow, as I am all coded out right now ... I worked on an update for GOGcli GUI and another program earlier today. So I will have to skip that game download for now.

Unwording and Retreat to Enen and Monorail Stories and Clid the Snail and Metro Simulator 2 (all just bought), passed.

Ghost on the Shore Demo (just grabbed), passed.

Garden Simulator (just bought), failed with an Extra.

Partisans 1941 (just bought), failed with an Extra and crashed my GUI.

I don't know why, but it seems to be mostly Extras, or other OS files, that cause a bad metadata failure. Is someone at GOG not taking the care they should, or is it perhaps a flaw in their code for adding games (extras/OS files)?
Post edited September 28, 2024 by Timboli
That crash mentioned last post was a silly mistake of mine: I'd overlooked renaming a variable that was already in use. It took about five seconds to fix, renaming 3 instances and declaring the new variable.

A Night at the Watermill (just bought), failed with an Extra.

Partisans 1941 (just bought), failed with an Extra.

Both those games then passed with flying colors, bad metadata for an Extra aside.

So that makes a dozen or so games now that needed my alternate generate method, with which my revamped code has had 100% success. Let's hope that continues.

I've also started work on implementing 7-Zip, which my GUI already uses if present, for when a checksum is missing for DMG, PKG and SH files. I've also coded the initial elements to support both InnoExtract (EXE + BIN files) and UnRAR for individual BIN files, though I am not going to work on them yet. Extra support for 7-Zip is relatively easy to add, and I am also including an option to create an MD5 value when 7-Zip reports success. The additional validating will occur after the regular validating of all other downloads, and will possibly be in response to a prompt to the user. I have a bunch of ideas about how best to handle what could be a lengthy additional process ... imagine a 15 GB Linux or Mac file. I don't recall how fast 7-Zip processes something like that, though it will vary depending on the power of the PC being used, in some cases a lot.
Post edited September 29, 2024 by Timboli