It's insane how far things have come.

I'm particularly shocked at BG3 taking up over 100 GB.
Assassin's Creed Valhalla? Total War Warhammer 3?!

Honestly, I find the DRM and Denuvo malware in those games far more disturbing than the actual size!
Post edited August 08, 2023 by 00063
Ancient-Red-Dragon: I have no idea what the point of the OP's thread is supposed to be.

What is he saying, that games should never increase in size as technology advances?

Sure they should.

And games have been over 100 GB for many years already. This is nothing "new."

That's always been how games work: as time advances, so does the necessary storage size, because games get bigger and have better graphics.

Nothing is wrong with that.

What's the alternative that the OP is suggesting? For graphics in video games to stagnate permanently and never improve? Because that's exactly what would happen if game size could never increase over time. And it wouldn't be worth it to impose such a size limitation.
It would be something if this trend of ever-increasing data sizes suddenly halted and turned around. Wasn't Isaac Asimov the one who 'predicted' machines only growing bigger and bigger? Maybe he was right after all, though not spot on. Even if there were some strange turnaround, I suppose the data stream would only grow larger. Or will it eventually come down to some sort of AI and a certain set of parameters... I wonder...
Zimerius: It would be something if this trend of ever-increasing data sizes suddenly halted and turned around.
There are two ways you can do this:

1. Invent a compression technique that has zero impact on uncompressed quality, compresses to a very small size, and can be decompressed on the fly with minimal processor overhead

2. Develop a very small AI that sits in a couple of megabytes on your hard drive and that can generate consistent, high quality textures on the fly that seamlessly fit into a game world

Both of these are pretty much impossible with current technology.

The third option is to just live with increasing file sizes - storage is getting cheaper and internet speeds are getting faster all the time.
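
To make option 1 concrete, here's a tiny Python sketch (purely illustrative, not how any engine actually packs assets): repetitive data compresses beautifully, but high-entropy data, which is what finished textures and audio mostly are, barely shrinks at all.

    import os, zlib

    # Repetitive data compresses well; high-entropy data (like
    # already-compressed textures or audio) barely shrinks at all.
    repetitive = b"grass_tile_0001" * 70_000   # ~1 MB of repeating bytes
    noisy = os.urandom(1_000_000)              # ~1 MB of random bytes

    for name, data in (("repetitive", repetitive), ("noisy", noisy)):
        packed = zlib.compress(data, 9)
        assert zlib.decompress(packed) == data   # lossless round trip
        print(f"{name}: {len(data):,} -> {len(packed):,} bytes")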
The largest install I've downloaded is 35 GB (Witcher 3, the old version). I have one that's larger in my library, but no room for it: 1 TB total space on this computer, split between 3 drives, with an automated daily backup of certain parts, the copy taking up 50 GB at the moment.
I still think of anything even over 10 GB as big, over 20 definitely so. At 50 I'd think 'fuck that' (I don't have any of those yet, the new version of Witcher 3 notwithstanding). And I definitely have no use for 4K+ or ultra-detail textures, or for voice packs in other languages, so I sure wish I won't need to download them as part of the installers.

(And yeah, you definitely mean bytes: 100 gigabits would be 12.5 gigabytes, and there'd be a whole lot of those by now. Bits are used for network bandwidth, bytes for size. You got the symbol right though: GB is gigabyte, Gb gigabit.)
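
For anyone who wants to check that arithmetic, it's one division in Python:

    # 8 bits to a byte: 100 gigabits is only 12.5 gigabytes
    gigabits = 100
    print(f"{gigabits} Gb = {gigabits / 8} GB")   # 100 Gb = 12.5 GB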
timppu: It would be more interesting to know which parts of the assets take the most space in those gigantic games. Ultra-high-res textures? Voice acting? At least it shouldn't be the FMV parts, as most games nowadays use the game engine for the non-interactive parts, right?

As games just seem to be getting bigger and bigger, does it mean each year they will have even higher res textures and more hours of voice acting, or where does the continuously increasing size come from?
I know that Call of Duty infamously has uncompressed WAV sounds for its pornographically detailed guns. Another area is overdetailing things nobody gives a damn about, such as seeing the pores on Skull Face's face or concrete texture #145 (a 2048 x 2048 texture). Or rendering camera flaws on the fly, such as chromatic aberration, lens flares, and other signs your camera sucks that no human eye should see.

Or, for another example, setting up for Dolby Atmos when most of us are using a 2.0/2.1 setup at best (and they want you to have 24.1.10), with 128 audio tracks carrying special metadata for something that, quite frankly, most people don't give a shit about.

Unless the voice lines were being generated on the fly, they'd still be prone to taking up more space than they reasonably should, because most games have about 6 hours of audio as a ballpark baseline. Even if you were to strip it down to strictly necessary audio via something like the Vorbis codec, that's still a lot of audio, no matter how you slice it.

Another source of bloat is installing all the languages, rather than letting the user select a language suited to their preference (or even none at all). Want someone shouting "GOAL" in 76 languages, including Esperanto, Klingonese, and Simlish? Too bad, now you have to!
Post edited August 07, 2023 by Darvond
Actually, 100 GB BD discs have been around for a good while (since the PS3 era, 15+ years ago°°), yet many users still consider that huge, even though it was basically a disc standard long ago.

Not so long ago, the console manufacturers finally "caught up", and the PC market got dragged along into this new spec as well, on a regular basis (previously it was pretty rare).

Anyway... SSD prices are finally coming down: 2 TB for less than 100 USD, what more could you ask for? And HDD prices are steady, but the difference is that around 10 years ago you had to pay 300 for a 10 TB drive, and now you get twice the space for the same price.

No need to feel offended... time is moving on and the drives are increasing in size too.

Of course, running some private datacenter is never cheap because of all the drives involved, but that is another scale entirely, and not a requirement for gaming.

°°Of course, the PS3 was using 50 GB discs; 100 GB discs were already available, but the drives, known as BDXL, were simply too expensive for Sony, and the industry was not ready for them (it would have been overkill). Nowadays the matter has clearly changed.

And no, it does not really make sense to buy a small HDD, unless you get it almost for free and are going to build a large NAS array... usually the huge HDDs have a better cost per GB/TB, and that is the spec that counts.

Darvond: I know that Call of Duty infamously has uncompressed WAV sounds for its pornographically detailed guns. Another area is overdetailing things nobody gives a damn about, such as seeing the pores on Skull Face's face or concrete texture #145 (a 2048 x 2048 texture). Or rendering camera flaws on the fly, such as chromatic aberration, lens flares, and other signs your camera sucks that no human eye should see.
There are human eyes enjoying all those incredible details, some of them on a 4K screen and maybe even an OLED. Some old-school gamers may not understand it, but everything has its place, and there are plenty of other games with low detail and small data sizes. If you do not believe me... just look around; such games are actually still the majority.

However, if you want to play a "franchise blockbuster", it may be full of non-essential details and a lot of data, because that is the stuff a "top notch" gamer on a modern system simply expects.

Generally, I am NOT AGAINST anything else, because to me everything has its place, and what to choose or enjoy very much depends on the individual; so it's great we have all those options available.
Post edited August 07, 2023 by Xeshra
Darvond: Unless the voice lines were being generated on the fly, they'd still be prone to taking up more space than they reasonably should, because most games have about 6 hours of audio as a ballpark baseline. Even if you were to strip it down to strictly necessary audio via something like the Vorbis codec, that's still a lot of audio, no matter how you slice it.
64 kbps Opus is plenty for speech; the vast majority of people would not be able to ABX it, and even when you could, none of the compression artefacts at that level are bothersome. At that rate it'd be about 170 megabytes for 6 hours of voice.
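
The back-of-the-envelope math, in Python for anyone checking:

    # 64 kbps for 6 hours of voice lines:
    bits = 64_000 * 6 * 3600            # bitrate (bits/s) * duration (s)
    print(f"{bits / 8 / 1e6:.0f} MB")   # ~173 MB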
You just need to use better compression algorithms.
clarry: 64 kbps Opus is plenty for speech; the vast majority of people would not be able to ABX it, and even when you could, none of the compression artefacts at that level are bothersome. At that rate it'd be about 170 megabytes for 6 hours of voice.
Sure. Now multiply that by each language you bundle, add audio post-processing and more unwarranted things, and... it quickly adds up.
Cavalary: (And yeah, you definitely mean bytes: 100 gigabits would be 12.5 gigabytes, and there'd be a whole lot of those by now. Bits are used for network bandwidth, bytes for size. You got the symbol right though: GB is gigabyte, Gb gigabit.)
Ponders... or did it have something to do with the way HDDs are sold? I'm not sure anymore. I literally have a memory of someone explaining to me how, from a sales perspective, the figures were counted in decimal, mostly for ease of mind, and that this was called bit instead of byte... I must be remembering this wrong.
Xeshra: Actually, 100 GB BD discs have been around for a good while (since the PS3 era, 15+ years ago°°), yet many users still consider that huge, even though it was basically a disc standard long ago.

Not so long ago, the console manufacturers finally "caught up", and the PC market got dragged along into this new spec as well, on a regular basis (previously it was pretty rare).

Anyway... SSD prices are finally coming down: 2 TB for less than 100 USD, what more could you ask for? And HDD prices are steady, but the difference is that around 10 years ago you had to pay 300 for a 10 TB drive, and now you get twice the space for the same price.

No need to feel offended... time is moving on and the drives are increasing in size too.
I have to say, I just happened upon Samsung's website, on the page for the 2 TB version of the 970 EVO NVMe. 600 euros!!
Orkhepaj: You just need to use better compression algorithms.
I have a book lying around somewhere, released around 2000. It is an investigation by a reporter detailing the mysteries surrounding the sudden disappearance of a scientist who had made it possible to stream multiple windows, almost full screen, of footage of soccer matches and the like on an ordinary consumer device of that time. Apparently he was in talks with some of the majors in the industry when he suddenly disappeared... Aliens, probably.
Post edited August 07, 2023 by Zimerius
Zimerius: Ponders... or did it have something to do with the way HDDs are sold?
No, the 'b' vs 'B' was always bit vs byte.

The "hard disk problem" you mean is gigabyte vs gibibyte.

1 GB = 10^9 bytes
1 GiB = 2^30 bytes

This was established over 20 years ago, because there was always big confusion about whether "Giga" meant giga in the metric system or giga in the binary sense.

However, Microsoft has ignored this to the present day.
File Explorer and all other dialogs use kibibytes, mebibytes, and gibibytes, but label them as kilobytes, megabytes, and gigabytes.

I think Unix/Linux uses the correct units (KiB, MiB, GiB, TiB), but I am not sure.

This is the reason why clients keep complaining that their hard drives or SD cards don't have the full storage as advertised. But they are wrong. Apart from what's lost to the file system, the storage is correct.


And there are of course exceptions:
One of the most original mixes is the HD floppy:
1.44 MB = 1440 Kibibyte = 1440 * 1024 Byte.
The first part is metric, the second one binary.
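
Both conversions, spelled out in Python for the record:

    GB, GiB = 10**9, 2**30

    # A drive sold as "1 TB" (decimal), as the OS reports it (binary):
    print(f"{10**12 / GiB:.0f} GiB")        # ~931 GiB

    # The HD floppy's mixed unit: 1440 * 1024 bytes is neither
    # 1.44 MB (decimal) nor 1.44 MiB (binary):
    floppy = 1440 * 1024
    print(floppy / 10**6, floppy / 2**20)   # 1.47456 MB vs 1.40625 MiB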
The issue is that the system cannot use a "metric" system, because it is a binary system. Yet hard drives are always specified using the metric system.

In the metric system, 1 GB is exactly 1 billion bytes. However, a computer uses a binary system, so it has to use GiB, and 1 GiB is roughly 1.074 billion bytes. It takes more bytes to make a GiB, so the drive shows fewer GiB inside the OS environment. What you actually see inside the OS is GiB, not the GB the drive is usually sold at. So GB is purely "marketing"; to the computer it has no meaning.

GB is simply the spec we usually buy drives at... so I cannot suddenly say GiB, or people will get confused.

Indeed, MS is confusing here, because it labels sizes as "GB" when it actually means GiB.

Zimerius: Ponders... or did it have something to do with the way HDDs are sold? I'm not sure anymore. I literally have a memory of someone explaining to me how, from a sales perspective, the figures were counted in decimal, mostly for ease of mind, and that this was called bit instead of byte... I must be remembering this wrong.
Manufacturers simply want to "push numbers", as decimal allows for bigger figures "to the eye". They could sell a drive as 20 GiB, for example, and the confusion would stop, because that is the amount the OS will show.

You could calculate everything in bits, but that makes no sense for very large amounts of data. Even for networking I sometimes use MB instead of Mb already; it just makes the effective speed more visible to me, so I know how fast my data will be transferred. If I use a 1 Gb/s connection, for example, that means roughly a bit more than 100 MB/s, and around 1,000 MB/s on 10 Gb/s. For marketing it may make sense to use "Gb/s", but it is not very practical for knowing your effective speed.
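
A rough rule of thumb in Python (the ~7% overhead is my own ballpark for TCP/IP over Ethernet, not a fixed constant):

    def effective_mb_per_s(gbps, overhead=0.07):
        # divide by 8 for bits -> bytes, then subtract protocol overhead
        return gbps * 1000 / 8 * (1 - overhead)

    print(f"{effective_mb_per_s(1):.0f} MB/s")    # ~116 MB/s on 1 Gb/s
    print(f"{effective_mb_per_s(10):.0f} MB/s")   # ~1162 MB/s on 10 Gb/s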
Post edited August 08, 2023 by Xeshra
Zimerius: After checking my library, I have so far 'discovered' a couple of titles that exceed the 100+ Gigabit size.
Timboli: Just for the record, you mean Gigabyte, not Gigabit. There are 8 bits in a byte.

So my web connection, for instance, is 50 Megabit, which equates to over 5 Megabytes a second... that being roughly 50 divided by 8, but then also factoring in the usual losses... so we never actually reach 6.25 Megabytes.
I usually figure 10 bits to a byte when transmissions are involved; there are CRC codes, TCP/IP overhead, and the occasional lost packet that needs retransmitting.

Even doing low-level serial communication, you'd need a start bit, 8 data bits, then a parity bit followed by a stop bit, if memory serves. 11 bits to send 8.

While using megabits/gigabits makes something sound bigger than it is (a Genesis game with 5 MEGA POWER!), for media streams it helps describe more precisely how large a bitstream is.

But yeah, best to not confuse which scale we're talking about.
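
That 11-bits-to-send-8 framing is easy to put numbers on (Python again):

    # start bit + 8 data bits + parity bit + stop bit = 11 bits per byte
    frame_bits = 1 + 8 + 1 + 1
    print(f"{8 / frame_bits:.1%} efficiency")   # 72.7% of the raw line rate
    # ...which is where the handy "10 bits per byte" rule of thumb lands,
    # once you skip parity (8N1 = 10 bits) or average in retransmits.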
Orkhepaj: You just need to use better compression algorithms.
Better compression, using textures and models that can be generated, or, my personal preference, having a low/mid-quality download as the default and HQ/Ultra as a separate package; I bet the download sizes would be cut to a third or a quarter.

Doing compression on my own as a side hobby, though, I'm finding a lot of places where things could be trimmed when you have access to the data.
Post edited August 08, 2023 by rtcvb32
neumi5694: 1 GB = 10^9 bytes
1 GiB = 2^30 bytes

This was established over 20 years ago, because there was always big confusion about whether "Giga" meant giga in the metric system or giga in the binary sense.
And those extra bits/bytes can make quite the difference in storage when added up over time.

neumi5694: I think Unix/Linux uses the correct units (KiB, MiB, GiB, TiB), but I am not sure.
Hard to say; when using the dd utility I often get two outputs, which are a mix of both. It's hard to tell which is which, though when you specify sizes with k/m/g it's the proper power of 2. Or so I've seen.

neumi5694: This is the reason why clients keep complaining that their hard drives or SD cards don't have the full storage as advertised. But they are wrong. Apart from what's lost to the file system, the storage is correct.

And there are of course exceptions:
One of the most original mixes is the HD floppy:
1.44 MB = 1440 Kibibyte = 1440 * 1024 Byte.
The first part is metric, the second one binary.
There's a second part, and that's the filesystem. FAT12 as used on floppies requires no less than 21 sectors to start with, I believe (the boot sector, 2 copies of the FAT at 9 sectors each, and a fixed root table, 2 sectors often holding 63-64 files). This reduces the maximum storage to something closer to 1.38 MB. On more recent filesystems you may get, say, a 1 terabyte drive, but it only has 10^12 bytes, and then a portion of that is blocked off for the filesystem as well, probably getting you more like 970 GB of space.
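
For concrete numbers (Python; note this assumes the common 1.44 MB layout with a 224-entry root directory, which reserves more sectors than the 21 estimated above):

    SECTOR = 512
    total = 2880                 # 1440 KiB at 512 B per sector
    # boot sector + two 9-sector FATs + 14-sector root dir (224 entries)
    reserved = 1 + 2 * 9 + 14
    usable = (total - reserved) * SECTOR
    print(f"{usable / 1024:.1f} KiB usable")   # 1423.5 of the nominal 1440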

Though in the past you could adjust the size of the sectors to get larger-than-normal floppies; add transparent compression with DoubleSpace (or DriveSpace on Windows 95) and you could feel like you had 4-5 MB disks...

edit: corrected 27 to 21 sectors, as FAT12 uses 12-bit table entries vs 16-bit, allowing up to 4K clusters
Post edited August 08, 2023 by rtcvb32