Timboli: No worries, thanks, I will check it out.
I'm still having download speed issues. Sometimes it is like yin and yang, with one download being an okay speed but the next quite slow, then maybe another faster download, then slow again, etc. Though I get more slow downloads than okay ones, and it seems that, more often than not, the BIN files download the slowest, which is painful with big files. It used to be that game extras downloaded the slowest, but for some time now I am finding they are usually the fastest.
Generally the download speed settles to a certain value and stays around that ... whether that's a good speed or a slow one, and I've not had any failures. I have tried stopping and restarting to overcome the slow download speeds, but I rarely, if ever, get any benefit from that.
The improved logging functionality and download retries are working quite well (you won't benefit from the info logs when running the storage actions, but you should try the debug log level for the api commands; I think you'll find it a bit better to look at than before). I'm pleased. The tool is starting to approach something I'm satisfied with.
I cogitated about the various things related to the tool yesterday.
Storage Commands Refactoring:
That one is more for me.
I think the storage commands are more complicated than they need to be. I should streamline the "storage apply" + "storage update-actions" + "storage resume" commands as follows:
storage apply manifest: It should either create an actions file in the storage or update the existing one, based on a fresh manifest, and nothing more (basically, part of what "storage apply" does and what "storage update-actions" does)
storage execute actions: It should run the pending actions on the storage (basically, the other part of what "storage apply" does and what "storage resume" does)
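Conceptually, the split would look something like this. A minimal Go sketch with made-up types and function names (planActions, executeActions, etc are just for illustration, not the tool's actual code):

package main

import "fmt"

// Hypothetical stand-ins for the tool's real manifest/actions structures.
type Manifest struct{ Files []string }
type Actions struct{ ToDownload []string }

// planActions is what "storage apply manifest" would do: derive an actions
// file from a fresh manifest and the storage's current contents, nothing more.
func planActions(m Manifest, alreadyStored map[string]bool) Actions {
    var a Actions
    for _, f := range m.Files {
        if !alreadyStored[f] {
            a.ToDownload = append(a.ToDownload, f)
        }
    }
    return a
}

// executeActions is what "storage execute actions" would do: run whatever
// actions file is currently in the storage, whether freshly planned or resumed.
func executeActions(a Actions) {
    for _, f := range a.ToDownload {
        fmt.Println("downloading", f) // the real command would fetch and store the file
    }
}

func main() {
    manifest := Manifest{Files: []string{"setup.exe", "game.bin"}}
    alreadyStored := map[string]bool{"setup.exe": true}
    executeActions(planActions(manifest, alreadyStored))
}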
Download File Api Command (to measure and improve download speed):
This one you'll be more interested in, as it pertains to your issue.
I thought about providing a continuously updated download speed indication in real time as the file downloads (ie, something approaching a derivative), but that is more work.
At first, I think I'll just output the following metrics after the download is done: size of the downloaded file, time it took to download it, and size / time. It's not as engaging as something approaching a derivative in real time, but for a first iteration to measure performance, I think it is good enough.
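To be concrete, the measurement itself is trivial; here's a minimal Go sketch of the idea (the URL argument and output file name are placeholders, this is not the actual command's code):

package main

import (
    "fmt"
    "io"
    "net/http"
    "os"
    "time"
)

func main() {
    url := os.Args[1] // URL of the file to download

    start := time.Now()
    resp, err := http.Get(url)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()

    out, err := os.Create("download.bin")
    if err != nil {
        panic(err)
    }
    defer out.Close()

    // Copy the body to disk and record how many bytes were written.
    written, err := io.Copy(out, resp.Body)
    if err != nil {
        panic(err)
    }
    elapsed := time.Since(start)

    // The three metrics: size, time, and size / time.
    fmt.Printf("size: %d bytes\n", written)
    fmt.Printf("time: %s\n", elapsed)
    fmt.Printf("speed: %.2f MB/s\n", float64(written)/1e6/elapsed.Seconds())
}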
After that (once we have a measurement), I need to figure out what to do to improve the download speed. The hoped-for approach is splitting the downloads, but there are various ways of doing that, each with its own trade-offs.
1. Just use arbitrary breakpoints in range downloads
This approach is very forward-compatible (ie, GOG.com is unlikely to break it, as it is part of the http standard) and it allows for very custom download granularity (ie, you can decide you want 20MB chunks, 50MB chunks, 100MB chunks, etc).
However, because the granularity is custom, GOG will not provide checksum values for the chunks, so in the event of a bad download it becomes impossible to know which chunk is bad (unless there is a size mismatch) and you have to restart the whole thing from scratch.
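For reference, option 1 is just the standard Range header. A minimal Go sketch (the 50MB chunk size is arbitrary, and it assumes the server reports Content-Length and honors range requests):

package main

import (
    "fmt"
    "io"
    "net/http"
    "os"
)

const chunkSize = 50 * 1000 * 1000 // arbitrary 50MB chunks

func main() {
    url := os.Args[1]

    // Ask for the total size first (this only works if the server reports Content-Length).
    head, err := http.Head(url)
    if err != nil {
        panic(err)
    }
    total := head.ContentLength

    out, err := os.Create("download.bin")
    if err != nil {
        panic(err)
    }
    defer out.Close()

    for offset := int64(0); offset < total; offset += chunkSize {
        end := offset + chunkSize - 1
        if end >= total {
            end = total - 1
        }

        // Standard http range request: bytes=start-end (inclusive).
        req, _ := http.NewRequest("GET", url, nil)
        req.Header.Set("Range", fmt.Sprintf("bytes=%d-%d", offset, end))

        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        if _, err := io.Copy(out, resp.Body); err != nil {
            panic(err)
        }
        resp.Body.Close()
    }
}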
2. Use the chunks recommended in the xml body provided by GOG for the offline downloads
I've noticed that GOG provides an xml body with checksums for certain range queries, but I've yet to take advantage of it.
I have two reservations about it: the provided xml was malformed for at least one game (Torment, if memory serves; there might be more), so there may be some games for which you can't use it. Also, I have a vague recollection that this xml was used by the now deprecated GOG downloader, so I have no idea how well it will be supported by GOG.com in the future.
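If the xml turns out to be usable, the verification side would look roughly like this. A Go sketch that assumes, from memory, a <file> element with <chunk from="..." to="..."> children carrying md5 sums; the actual schema may well differ:

package main

import (
    "crypto/md5"
    "encoding/hex"
    "encoding/xml"
    "fmt"
    "os"
)

// Assumed structure of GOG's per-file xml: a <file> element whose <chunk>
// children carry a byte range and the md5 of that range. Field names here
// are my guess from memory, not a verified schema.
type fileInfo struct {
    Name   string  `xml:"name,attr"`
    Chunks []chunk `xml:"chunk"`
}

type chunk struct {
    From int64  `xml:"from,attr"`
    To   int64  `xml:"to,attr"`
    Sum  string `xml:",chardata"`
}

func main() {
    xmlBytes, err := os.ReadFile("file.xml")
    if err != nil {
        panic(err)
    }
    var info fileInfo
    if err := xml.Unmarshal(xmlBytes, &info); err != nil {
        panic(err)
    }

    f, err := os.Open(info.Name)
    if err != nil {
        panic(err)
    }
    defer f.Close()

    // Check each downloaded chunk against its advertised md5, so only the
    // bad chunk needs to be re-fetched instead of the whole file.
    for i, c := range info.Chunks {
        buf := make([]byte, c.To-c.From+1)
        if _, err := f.ReadAt(buf, c.From); err != nil {
            panic(err)
        }
        sum := md5.Sum(buf)
        if hex.EncodeToString(sum[:]) != c.Sum {
            fmt.Printf("chunk %d (%d-%d) is bad, re-download that range\n", i, c.From, c.To)
        }
    }
}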
3. Investigate the new Galaxy Endpoints
This will be very forward-compatible, but I see two separate issues: I'm not sure if the cookie info I'm using will allow full usage of those endpoints (I've noticed it works for some, I'm not sure about all of them), and I'm also not sure if it is possible to retrieve the offline installers via those endpoints or if you can only get Galaxy installers.
Anyways, no matter what I end up using, I think I'll first do a POC via the "gog-api download" command before refactoring the tool completely around a particular solution.