avatar
MarkoH01: So you are telling me that Windows is not using virtual memory at all when booting?
avatar
WinterSnowfall: Not unless it has no other choice, in which case you are most likely below the recommended specs or are running a dangerously bloated system and it's time for a reinstall.
Well, that (a bloated, unoptimized system) could have been the case with my mother's PC as well ;)
But honestly - it took several minutes until I was finally able to access this PC ... I am not lying. That's why I still remember it. When I came back home and booted my own PC, it was like another world.
avatar
WinterSnowfall: Not unless it has no other choice, in which case you are most likely below the recommended specs or are running a dangerously bloated system and it's time for a reinstall.
avatar
MarkoH01: Well, that (a bloated, unoptimized system) could have been the case with my mother's PC as well ;)
But honestly - it took several minutes until I was finally able to access this PC ... I am not lying. That's why I still remember it. When I came back home and booted my own PC, it was like another world.
That was probably due to fragmentation of the files on the HDD, all other things being optimal.

Here's a list of stress tests someone did with Win XP, Vista, 7, 8, and 10 btw.
(Minimum specs to run)

https://intosanctuary.com/index.php?threads/windows-xp-professional-x86-sp2-torture-test.914/

https://intosanctuary.com/index.php?threads/windows-vista-ultimate-x64-sp2-torture-test.916/

https://intosanctuary.com/index.php?threads/windows-7-ultimate-x64-sp1-torture-test.917/

https://intosanctuary.com/index.php?threads/windows-8-1-professional-x64-torture-test.918/

https://intosanctuary.com/index.php?threads/windows-10-professional-x64-build-2004-torture-test.919/

--
Oh yeah, and I'm running Win 8 with 4GB RAM. ;)
(I'd love to still be on XP though. Best Win OS ever.)
avatar
kohlrak: 2GB, 512MB of which goes to my GPU. That, and consider that if you want to play a game, Galaxy's resource hogging will take resources away from that game. You can have 8GB, but if you have a background process constantly getting cycles that uses 6GB, you only have 2GB left for the OS and the like. Amazing what people gain simply from closing their browser.
avatar
MarkoH01: You are running a Windows OS on 2GB? Wow! It must take ages until the PC has even finished booting (I once witnessed this on a 4GB PC with WinXP and it was no fun).

@everybody who chimed in to tell me that my assumption was wrong - my apologies. Consider me surprised, I honestly did not expect that so many are still running with 4GB or even less.
IIRC, 32-bit Windows (most likely what you'll see with Windows XP) is capped at 2GB because of absolute pointers (Windows might not want to manage paging) and a strong and reasonable desire to avoid utilizing the sign bit (thus making 32-bit Windows 31-bit instead). This, in turn, would limit it to 2GB.
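For reference, the 2GB figure falls out of the pointer arithmetic: a 32-bit pointer spans 4 GiB, and Windows' default 2GB/2GB kernel/user split leaves user mode half of that (there is also a /3GB boot option for large-address-aware executables). A quick sketch of the numbers, not Windows code:

```python
# The arithmetic behind the classic 32-bit Windows limit (a sketch, not
# Windows code): a 32-bit pointer spans 4 GiB; the default 2GB/2GB
# kernel/user split leaves the user half at 2 GiB.
GiB = 1024 ** 3

full_32bit = 2 ** 32      # everything a 32-bit pointer can address
user_space = 2 ** 31      # the default user-mode half

print(full_32bit // GiB)  # 4
print(user_space // GiB)  # 2
print(user_space)         # 2147483648, i.e. the famous 2GB ceiling
```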

Also, the bottleneck in a Windows bootup isn't the RAM but the hard drive. Booting a Windows computer from a solid-state drive improves boot time significantly.
avatar
MarkoH01: Well, that (a bloated, unoptimized system) could have been the case with my mother's PC as well ;)
But honestly - it took several minutes until I was finally able to access this PC ... I am not lying. That's why I still remember it. When I came back home and booted my own PC, it was like another world.
avatar
Vendor-Lazarus: That was probably due to fragmentation of the files on the HDD, all other things being optimal.

Here's a list of stress tests someone did with Win XP, Vista, 7, 8, and 10 btw.
(Minimum specs to run)

https://intosanctuary.com/index.php?threads/windows-xp-professional-x86-sp2-torture-test.914/

https://intosanctuary.com/index.php?threads/windows-vista-ultimate-x64-sp2-torture-test.916/

https://intosanctuary.com/index.php?threads/windows-7-ultimate-x64-sp1-torture-test.917/

https://intosanctuary.com/index.php?threads/windows-8-1-professional-x64-torture-test.918/

https://intosanctuary.com/index.php?threads/windows-10-professional-x64-build-2004-torture-test.919/

--
Oh yeah, and I'm running Win 8 with 4GB RAM. ;)
(I'd love to still be on XP though. Best Win OS ever.)
I remember a fun article where comparable tasks like checking email were used to pit an '80s Macintosh against a mid-2000s Windows XP machine, and Windows lost. Good ol' Wirth's Law.
Post edited February 21, 2021 by kohlrak
avatar
kohlrak: IIRC, 32-bit Windows (most likely what you'll see with Windows XP) is capped at 2GB because of absolute pointers (Windows might not want to manage paging) and a strong and reasonable desire to avoid utilizing the sign bit (thus making 32-bit Windows 31-bit instead). This, in turn, would limit it to 2GB.
2 GB per application, not total, and I think there's some optional patch or tool that removes that limit. The maximum usable is some 3.5 GB, though that can be lowered by graphics.
avatar
kohlrak: 2GB, 512MB of which goes to my GPU. That, and consider that if you want to play a game, Galaxy's resource hogging will take resources away from that game. You can have 8GB, but if you have a background process constantly getting cycles that uses 6GB, you only have 2GB left for the OS and the like. Amazing what people gain simply from closing their browser.
avatar
MarkoH01: You are running a Windows OS on 2GB? Wow! It must take ages until the PC has even finished booting (I once witnessed this on a 4GB PC with WinXP and it was no fun).

@everybody who chimed in to tell me that my assumption was wrong - my apologies. Consider me surprised, I honestly did not expect that so many are still running with 4GB or even less.
XP should run well and quickly on 2GB, as already stated. I'm using Windows 10 v.1909 on a 2010 PC with a 2010 graphics card and 4GB RAM. Naturally I only run old games! Also, this PC is offline for security, freedom from interruptions and the horrible Windows feature updates; the 2020 updates were particularly bad. On an online PC and laptop, they slowed things down, and a problem with the LAN driver arose. The updates also took up a lot of HDD space, even after doing a Disk Clean-up.

So offline installers of any kind are important to me. I've built up a library of apps and games for PC, and also for my rooted Android tablet. The offline PC is internet-ready, but its anti-malware apps are out of date, and it's a nuisance moving the PC and wiring it in to the router. A Wi-Fi dongle might be an idea.
Post edited February 21, 2021 by eando52
low rated
Honestly, comparing memory usage on most modern OSes, especially against older ones, is an exercise in futility.

A lot of programs running on Windows nowadays use .NET or other similar high-level programming frameworks. Those frameworks use garbage collectors that are usually configured very conservatively, to only release memory when a certain dynamic threshold is reached or when it is actually needed.

So even if in Task Manager you see a program taking 2 GB, it doesn't necessarily mean that it is badly programmed or that it really needs those 2 GB; it can simply mean that the GC considered it not yet necessary to free this memory, as nobody else needed it.

It's no longer like in the "old days", when the majority of programs were written in non-managed languages and you really had very tight control over memory consumption.
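The .NET GC isn't shown here, but CPython's cyclic collector is a small, checkable illustration of the same "collect only past a threshold" idea: unreachable garbage lingers until a collection actually runs, so a process's reported memory overstates what it truly needs.

```python
import gc

# CPython's cyclic garbage collector, like the threshold-driven GCs
# described above, does not reclaim cyclic garbage immediately;
# generation 0 is only collected once an allocation count passes a threshold.
print(gc.get_threshold())   # commonly (700, 10, 10) in CPython

class Node:
    def __init__(self):
        self.ref = self     # a reference cycle: refcounting alone can't free it

gc.disable()                # pause automatic collection to observe the backlog
for _ in range(1000):
    Node()                  # each instance becomes unreachable garbage

freed = gc.collect()        # an explicit collection reclaims it in one batch
print(freed > 0)            # True: the deferred garbage existed until now
gc.enable()
```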
avatar
kohlrak: IIRC, 32-bit Windows (most likely what you'll see with Windows XP) is capped at 2GB because of absolute pointers (Windows might not want to manage paging) and a strong and reasonable desire to avoid utilizing the sign bit (thus making 32-bit Windows 31-bit instead). This, in turn, would limit it to 2GB.
avatar
Cavalary: 2 GB per application, not total, and I think there's some optional patch or tool that removes that limit. The maximum usable is some 3.5 GB, though that can be lowered by graphics.
Yeah, the patch enables some PAE stuff (which actually predates 64-bit CPUs; it was introduced with the Pentium Pro). The physical address bus on early 64-bit CPUs is 40 bits, making the new limit 1,099,511,627,776 bytes (the sign bit not being a factor this time), which takes us to about 1TB max.
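The arithmetic here checks out. For what it's worth, PAE proper widens physical addresses to 36 bits (64 GiB), while 40 bits is what early AMD64 implementations exposed:

```python
# Checking the physical-address arithmetic from the post above.
GiB = 1024 ** 3
TiB = 1024 ** 4

print(2 ** 36 // GiB)  # 64: PAE widens physical addresses to 36 bits (64 GiB)
print(2 ** 40)         # 1099511627776
print(2 ** 40 // TiB)  # 1: 40 physical address bits give exactly 1 TiB
```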
avatar
Gersen: Honestly comparing memory usage on most modern OSes, especially when comparing to older ones, is an exercise in futility.

A lot of programs running on Windows nowadays uses .Net or other similar high level programing framework. Those framework uses garbage collectors that are usually configured very conservatively to only release memory when a certain dynamic threshold is reached or when it is actually needed.

So even if in the task manager you have a program taking 2 GB it doesn't necessarily means that it is badly programmed, or that it really needs those 2 GB it can simply mean that the GC consider that it wasn't yet needed to free this memory as nobody else was needing it.

It's no longer like in the "old days" where the majority or programs where written in non-managed language and you really had a very tight control over the memory consumption.
And what triggers these memory managers? I've noticed they tend to only kick in when paging occurs. It's for this reason I can't use even old versions of Firefox; they'll use up the entire RAM before they'll let go.

This can be faster in an environment where you have enough RAM to cover all tasks, because you're not taking the time to constantly move things around. The old malloc/realloc/reallocarray approach meant that if you wanted to shrink or grow an allocation, you risked having to actually move everything if something else was allocated too close after it. This could cause slowdowns. The tradeoff is to add more RAM to gain speed (since copying takes a lot of time, especially if you have two consecutive memory regions, as a browser might, that are constantly changing size). On the flip side, you lose this RAM until a lack of RAM triggers all the memory managers to shrink.

The thing is, for computers without a lot of RAM, this ends up being really, really slow. For computers with more RAM than they need, it's far faster than the old method. Whether it's more or less efficient depends entirely on whether you view RAM cycle counts or RAM capacity as the bottleneck. Personally, I like the old way better, as OSes themselves have been optimized for the old scheme, not this new memory management scheme. Moreover, it's easier for the OS to manage memory with fake segments than for the memory managers to do it. Honestly, they step way out of bounds by trying to do the OS's job.
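CPython's list over-allocation is a small, checkable illustration of this exact tradeoff: the runtime deliberately reserves extra capacity so that most appends avoid the realloc-and-copy described above, at the cost of holding memory it isn't using yet (CPython-specific behaviour; the exact growth pattern varies by version):

```python
import sys

# CPython lists over-allocate to amortize the realloc-and-copy cost:
# capacity grows in occasional jumps, not once per append.
lst = []
sizes = []
for _ in range(64):
    lst.append(None)
    sizes.append(sys.getsizeof(lst))

# Count how many appends actually caused a reallocation (a size jump).
growth_steps = sum(1 for a, b in zip(sizes, sizes[1:]) if b > a)
print(growth_steps < 16)       # True: far fewer reallocations than appends
print(sizes == sorted(sizes))  # True: capacity never shrinks while appending
```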

EDIT: Also, I've seen a few of them leak, too. The whole point of memory managers was to prevent leaks, since, supposedly, programmers would consistently forget to free before initializing a new array, but honestly, this can be effectively mitigated. Ironically, these memory managers often end up forgetting to free memory, or reusing old memory instead of allocating new memory. One of the greater examples of this is Java, where the Java edition of Minecraft, despite constantly unloading chunks, manages to grow and grow and grow, which suggests the Java memory manager tends to leak (for reasons I don't know, because I've learned to stay away from that mess of a language when I can).
Post edited February 21, 2021 by kohlrak
avatar
Gersen: Honestly, comparing memory usage on most modern OSes, especially against older ones, is an exercise in futility.

A lot of programs running on Windows nowadays use .NET or other similar high-level programming frameworks. Those frameworks use garbage collectors that are usually configured very conservatively, to only release memory when a certain dynamic threshold is reached or when it is actually needed.

So even if in Task Manager you see a program taking 2 GB, it doesn't necessarily mean that it is badly programmed or that it really needs those 2 GB; it can simply mean that the GC considered it not yet necessary to free this memory, as nobody else needed it.

It's no longer like in the "old days", when the majority of programs were written in non-managed languages and you really had very tight control over memory consumption.
Well, it can be a bit like the clues in a detective novel; you've got to weigh up several aspects. At the end of the day, though, it is all about performance, and if that is taking a hit, you know there is a problem, and you start with the most obvious culprit(s). Being a bit of a programmer myself, I know a bit about optimization and I know about bloat, and frankly about coders showing off, trying to be the next clever coder on the block ... many of them get too clever and forget the really important stuff.

It's like when you buy a mobile phone, and it seems like actually using it as a phone was given the least focus, so what should just work and be simple is mired in someone trying to be clever with fancy new features and mind-reading.

Many people rave about modern coding, but in reality, much of it is trickery, clever new ways of getting around things, and it often doesn't work so well, because it is often reliant on several other aspects all lining up neatly, including hardware. And frankly, there is a lot of cutting corners all over the shop, usually to make things quick and simple.

The problem in the software and hardware world is that it all ages way too quickly, so the focus is on spending the least amount of time on something, so that you can move on to the next progressive thing and not fall too far behind. You certainly see that in games, where development has taken too long and a game often gets canned for being out of date. The thing is, though, that doesn't always equate to a loss of sales, especially if the really important elements of the game have been well done.

So my own personal view, and I see no reason to question it, is: do a good job, focus on quality and optimization. Do that, and sure, you might not make deadlines and be the next great thing, but you won't make a loss either, and certainly, as has been proved many times, you can still make a killing.

There needs to be more focus on the longer term, more important objective(s).
Post edited February 21, 2021 by Timboli
high rated
avatar
SmollestLight: Unfortunately, some of our systems are currently not fully operational which results in offline installer delays. Thanks for your patience and understanding.
avatar
MarkoH01: Thank you very much for this statement. Fingers crossed that you will be able to fix this soon.
Bringing this back just to let you know the issue isn't completely resolved just yet. However, things are slowly back on track and our team is doing their best to catch up. We will let you know once it's fixed. =)
avatar
MarkoH01: Thank you very much for this statement. Fingers crossed that you will be able to fix this soon.
avatar
SmollestLight: Bringing this back just to let you know the issue isn't completely resolved just yet. However, things are slowly back on track and our team is doing their best to catch up. We will let you know once it's fixed. =)
Hello, I don't suppose I could take the opportunity to please ask if this might include updating the installers of the game "Wolfenstein II: The New Colossus" for its Japanese language advertised on the store page?

I used a tool to unpack the provided installers to examine, and it seems the relevant files are totally missing.

My support ticket from over 2 months ago (filed when the game released here and I encountered the problem; it detailed how I tried everything but was unable to find how to install Japanese language support) still has not received a reply.

Edit: My support ticket just got its first reply! Not solved yet, but I am glad to hear that GOG is now aware of the problem and looking to fix it.
Post edited February 23, 2021 by emme
avatar
MarkoH01: Thank you very much for this statement. Fingers crossed that you will be able to fix this soon.
avatar
SmollestLight: Bringing this back just to let you know the issue isn't completely resolved just yet. However, things are slowly back on track and our team is doing their best to catch up. We will let you know once it's fixed. =)
It was at least noticeable that things improved and more and more offline installers were updated, so thank you for this. Also thank you for telling us about the actual situation. It is really, really appreciated.
avatar
SmollestLight: Bringing this back just to let you know the issue isn't completely resolved just yet. However, things are slowly back on track and our team is doing their best to catch up. We will let you know once it's fixed. =)
Thank you for the status report. Very appreciated!
avatar
MarkoH01: Thank you very much for this statement. Fingers crossed that you will be able to fix this soon.
avatar
SmollestLight: Bringing this back just to let you know the issue isn't completely resolved just yet. However, things are slowly back on track and our team is doing their best to catch up. We will let you know once it's fixed. =)
Thanks for the heads up!
avatar
MarkoH01: Thank you very much for this statement. Fingers crossed that you will be able to fix this soon.
avatar
SmollestLight: Bringing this back just to let you know the issue isn't completely resolved just yet. However, things are slowly back on track and our team is doing their best to catch up. We will let you know once it's fixed. =)
Ok, this is the level of transparency I want to see from GOG. This is what brings confidence: actual progress reports. I don't know if you can pass on feedback or not, but once things are not so unstable, perhaps you could redirect the following to the Galaxy team, as it could end up saving GOG a lot of money if they can sell the idea to investors. If not, I don't expect to be able to just drop suggestions, but if you do see this, I think you'll see where it's palatable.

The industry (companies like Google, Microsoft, IBM, etc.) is moving towards an updating model called "delta updates." The idea is to save bandwidth and storage space by making installation scripts rather than direct exe installers. The actual installer is just one program (called a "package manager") that can read the installer script (called a "package"). There are "delta packages" as well, which contain short changes to the original script that are much, much smaller in size and are designed to be integrated into the original installer script. If GOG were to create its own package manager, abandoning Inno Setup, as well as making Galaxy use the package manager instead of the "Steam method" (which is smaller than a pure installer, but still larger than delta updates, which can contain data for a small change in a very large file), then GOG could create standalone installers with the same system as the Galaxy installers without forcing users to use Galaxy, while also improving Galaxy. If done correctly, and the packages are cached on users' systems, this would allow downgrading versions as necessary, as well as games sharing things like the DirectX redistributable packages without constantly having to download them every time (you would mark the DirectX packages as "dependencies"). In short, GOG would improve storage space, bandwidth, and automation, while also having competitive solutions that its own competitors aren't using yet (but are likely developing).
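The delta idea above can be sketched in a few lines. This is a toy byte-level format invented purely for illustration, not GOG's system or any real package manager's (real systems use tools like bsdiff or xdelta): record only the bytes that changed, and rebuild the new file from the old file plus that record.

```python
# A toy byte-level delta format, sketching the idea behind delta updates.
# (Invented here for illustration; real systems use bsdiff, xdelta, etc.)

def make_delta(old: bytes, new: bytes) -> dict:
    """Record only what changed: differing bytes plus any new tail."""
    n = min(len(old), len(new))
    return {
        "len": len(new),                                          # final size
        "patches": [(i, new[i]) for i in range(n) if old[i] != new[i]],
        "tail": new[len(old):],                                   # bytes past old EOF
    }

def apply_delta(old: bytes, delta: dict) -> bytes:
    """Rebuild the new version from the old version plus the delta."""
    buf = bytearray(old[: delta["len"]])
    buf.extend(delta["tail"])
    for i, b in delta["patches"]:
        buf[i] = b
    return bytes(buf)

# One small version bump in a large file yields a delta far smaller
# than shipping the whole file again:
old = b"some large game data file, version 1.0\n" * 1000
new = old.replace(b"1.0", b"1.1")
delta = make_delta(old, new)
print(apply_delta(old, delta) == new)    # True: the patch reproduces the new file
print(len(delta["patches"]) < len(new))  # True: far fewer entries than bytes
```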
avatar
MarkoH01: Thank you very much for this statement. Fingers crossed that you will be able to fix this soon.
avatar
SmollestLight: Bringing this back just to let you know the issue isn't completely resolved just yet. However, things are slowly back on track and our team is doing their best to catch up. We will let you know once it's fixed. =)
So does this mean that the numerous out-of-date GOG offline installers across the site will be updated?

I'm sceptical that this will be the case, as this has been an issue with GOG for years, more so since Galaxy.