Lin545
Registered: Jun 2011
From Russian Federation
Posted May 08, 2017
An 8 GB DRAM-based SSD still sold used for $1500 in 2015. However, these never took off as expected - probably because production costs were too high - so they stayed niche cache/buffer disks (also very good for swap). If this new thing achieves the claimed 1000x higher durability, though, it could arguably be used for swap without much looking back. I hope I didn't break any NDAs...? Thoughts?

Themken
Old user
Registered: Nov 2011
From Other
Posted May 08, 2017
Intel already announced the Optanes and there are some reviews of them out there as well.
There are these small ones as well: https://www.scan.co.uk/shop/pro-gaming/storage-drives/intel-optane-memory-system-accelerators
I know one used to be able to turn the swap file completely off in Windows (do not do this without plenty of memory!) but I do not know if that is still the case. I have no info about setting a swappiness level in Windows, like you can do in Linux, but if they are smart they have it at least in the non-home versions of Windows.
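For what it's worth, on Linux the current swappiness can be read straight from /proc; a minimal Python sketch (the path is the standard Linux one, and the value range is 0-200 on recent kernels):

```python
# Read the current Linux swappiness value (higher means the kernel
# swaps more aggressively). Returns None off-Linux.
def read_swappiness(path="/proc/sys/vm/swappiness"):
    try:
        with open(path) as f:
            return int(f.read().strip())
    except FileNotFoundError:
        return None  # not on Linux, or /proc not mounted

print(read_swappiness())
```

Setting it requires root (e.g. `sysctl vm.swappiness=10` or writing to the same file) and does not persist across reboots unless put in /etc/sysctl.conf.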
Post edited May 08, 2017 by Themken

OlivawR
New User
Registered: Dec 2013
From Italy
Posted May 08, 2017
My experience with swap/pagefile made me realize that I do not need them anymore. I haven't seen them in use with more than a few MB, so I just disabled them (no pagefile on Windows and no swap partition on Linux). Don't be fooled by the size of your pagefile/swap partition: just because it's 4 GB doesn't mean your system is using all of it.
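On Linux you can check how much swap is actually in use by comparing SwapTotal and SwapFree in /proc/meminfo; a rough Python sketch (on Windows, Task Manager shows the equivalent):

```python
# Report total and used swap in MB on Linux, or None elsewhere.
def swap_usage_mb(path="/proc/meminfo"):
    try:
        with open(path) as f:
            info = {}
            for line in f:
                key, _, rest = line.partition(":")
                fields = rest.split()
                if fields:
                    info[key.strip()] = int(fields[0])  # values are in kB
    except (FileNotFoundError, ValueError):
        return None
    total = info.get("SwapTotal", 0) // 1024
    free = info.get("SwapFree", 0) // 1024
    return total, total - free  # (total_mb, used_mb)

print(swap_usage_mb())
```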

skeletonbow
Galaxy 3 when?
Registered: Dec 2009
From Canada
Posted May 08, 2017
Keep in mind that the lifespan hard disks, SSDs and most other consumer electronics are rated for is not a hard date upon which everything dies instantly once it passes, while everything that hasn't reached it works like brand new. It is the MTBF, or "mean time between failures", which is a statistical figure. Manufacturers test the drives in a lab, performing various forms of wear and tear on a large enough sample of hardware to induce failure, and see how much abuse they can take before they die. Normally that is an accelerated testing procedure, because they can't design an SSD and then test 1000 units of a given model for 10 actual years just to be able to say "they last for 10 years, yup!". They perform a multitude of reads/writes and find out how much the drives can handle on average before exploding and blowing chunks, then they look at the read/write rates typical of consumer and/or enterprise usage with various applications/games to estimate how long ordinary usage will take to reach the number of writes at which the hardware failed in the lab.
I'm completely making up the following numbers just for illustrative purposes. Let's say 20 drives die after roughly 40,000 operations, another 20 after 50,000 and another 20 after 60,000. The average lifespan overall is then 50,000, which means some drives will die before that, others will last longer and some will last exactly that long. Then in actual usage, one consumer's workload might take only 3 years to reach 50,000 operations, making a drive failure mathematically much more likely, while someone who uses the computer in a lighter-weight manner, with software that writes less to the hardware, might take 10 or 12 years to generate 30,000 operations.
The drives are not spec'd in years but in operations the hardware is designed to handle, and since the actual time it takes to kill a drive depends on how it is used and how often, the death of a single specific drive can vary dramatically from a couple of years to over a decade. Exposure to mishandling, harsh environments, bad electrical power or other factors can additionally shorten a drive's life.
So, the life expectancy isn't bullshit; it is a mathematical calculation determined by experimentation, combined with estimates of how people use drives in general, to get a well-rounded figure on which to base a warranty - a number of years of service for which they are confident the majority of the hardware will survive. Some drives will die before that, and they likely expect it; others may last longer.
It's the nature of just about everything sold in stores that contain electronics and/or electro-mechanical parts though.
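The made-up numbers above can be run as an actual back-of-the-envelope calculation; every figure here is hypothetical, and the per-year operation rates are assumptions chosen only to show the spread:

```python
# Hypothetical failure data: 20 drives die at ~40,000 operations,
# 20 at ~50,000, and 20 at ~60,000.
failures = [40_000] * 20 + [50_000] * 20 + [60_000] * 20
mean_ops = sum(failures) / len(failures)  # the "average lifespan": 50,000 ops

def years_to_reach(ops, ops_per_year):
    """How long a given usage pattern takes to reach a wear level."""
    return ops / ops_per_year

heavy = years_to_reach(mean_ops, 16_000)  # heavy user hits the mean in ~3.1 years
light = years_to_reach(mean_ops, 4_000)   # light user would need ~12.5 years
print(mean_ops, round(heavy, 1), round(light, 1))
```

Same drive, same rating; the only thing that changed between the two results is the usage pattern.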

Lin545
Registered: Jun 2011
From Russian Federation
Posted May 08, 2017

I am surprised they have not made a full-size SSD out of it yet, and that they sell such low volumes at ... affordable prices. Probably they are either "public beta testing" it, or the memory is not reliable yet. Or they want to attack AMD (via the war of "premiums") instead of Samsung & co...
Post edited May 08, 2017 by Lin545

Cyker
New User
Registered: Mar 2011
From United Kingdom
Posted May 08, 2017
Well Flash does have a relatively low endurance compared to almost every other type of storage media.
However, SSDs have a lot of mitigation strategies in place to make the impact less noticeable - buffering and batching, sector relocation etc.
This has some downsides, as seen with earlier SSDs (lose power at a bad time and all SSD data is lost!), but for general consumer use they will last pretty much as long as a regular HDD. (The difference being the SSD will die of write exhaustion, whereas the HDD will usually die from the logic board getting fried or, more commonly, the motor burning out.)
Flash is pretty shitty stuff for heavy writes tho' - there is a whole debate in the car dashcam world, where we find microSD cards dying after weeks or months of constant overwriting.
To be fair, a dashcam is a worst-case scenario for an SD card - constant full linear writes and minimal reads, the complete opposite of what flash is good at (i.e. infrequent writes, massive random reads). Still, it is pretty disgraceful that flash manufacturers will automatically void the warranty of a flash card if it's used in a dashcam, as it shows they know how bad the write endurance is.
Several SD card manufacturers, most recently SanDisk, have been marketing high-endurance SD cards which are specifically MLC instead of TLC (the write endurance of TLC is awful compared to MLC). However, even these are not great - a 64 GB high-endurance microSD card is rated at a pretty pathetic 2 years of use.
So the low life is not a lie exactly, it's just that we've learned to engineer around it!
Personally I'm hoping for things like memristors, MRAM, PSRAM and other such non-volatile next-gen memory systems to pick up. We might even see a resurgence of OS like PalmOS which run directly from RAM instead of the boring and slow load-store-execute architecture we still use!
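The dashcam numbers above can be sanity-checked with rough arithmetic; the P/E cycle count and video bitrate below are assumptions for illustration, not vendor specs:

```python
# Rough wear-out estimate for a TLC microSD card under constant dashcam writes.
capacity_gb = 64       # card size
pe_cycles = 500        # assumed program/erase cycles for consumer TLC flash
bitrate_mbps = 10      # assumed dashcam video bitrate

gb_written_per_day = bitrate_mbps / 8 * 86_400 / 1_000   # ~108 GB/day
total_endurance_gb = capacity_gb * pe_cycles             # ~32 TB of writes
days_of_life = total_endurance_gb / gb_written_per_day

print(round(gb_written_per_day), round(days_of_life))    # roughly 108 and 296
```

That optimistically assumes perfect wear leveling; real cards often die sooner, which lines up with the weeks-to-months failures seen in practice.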

F4LL0UT
Get Showgunners!
Registered: Jun 2011
From Poland
Posted May 23, 2017

Also keep in mind that the article is from 2009. Since then Microsoft has only gotten better at handling SSDs.
What's funny, though: after the initial scare I did some math and noticed that even using the drive for the pagefile, its "guaranteed" 300 TB of writes would take more than three years to use up at that rate. That's still too fast for me to keep using the SSD for the pagefile (at least by my current standards, and at least until I get more RAM), but it's actually really reassuring.
That said: damn, putting the pagefile on the SSD did speed things up a damn lot.
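Sketching that math out: the 300 TB endurance figure is from the drive's spec as quoted above, while the daily write rate is an assumption picked to land at roughly three years:

```python
# How long a 300 TB write-endurance budget lasts under heavy pagefile use.
endurance_gb = 300_000   # the drive's "guaranteed" 300 TB of writes
daily_writes_gb = 270    # assumed pagefile traffic per day (heavy workloads)

years = endurance_gb / daily_writes_gb / 365
print(round(years, 1))   # ~3.0 years at this rate
```

With more RAM the daily pagefile traffic drops sharply, and the same budget stretches to a decade or more.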

MadalinStroe
Veni, vidi, vici
Registered: Apr 2012
From Romania
Posted May 23, 2017
May I ask how much RAM you have? Because if it's less than the recommended amount for the software you run, then it's natural that it would cause a lot of paging.
However, I still believe that for normal everyday use, 16 GB RAM and an SSD with the pagefile on it will ensure the best Windows experience, while also not negatively impacting the SSD. 32 GB is my dream but it will take me some time.
Post edited May 23, 2017 by MadalinStroe

kbnrylaec
Asuka Tanaka
Registered: Nov 2011
From Taiwan
Posted May 23, 2017
You can simply turn off pagefile if you have 16 GB RAM. :-P

Trilarion
New User
Registered: Jul 2010
From Germany
Posted May 23, 2017

It may depend on usage or manufacturing process.
I think I remember there was a tool that could read out a hard disk's SMART data, including its failure management: the more reallocated sectors you had, the closer your hard disk was to being dead.
I could imagine that without all the mechanical parts of a HDD, an SSD could live even longer.
On the other hand, data retention is also an important feature. So, if I archive my GOG games on an SSD or HDD, will the data still be there if I don't touch the drives for many years?

JAAHAS
100% Steamless
Registered: Sep 2008
From Finland
Posted May 23, 2017


A small RAM disk that is loaded at startup, before Windows reads/creates the pagefile, seems to be the best solution, although that will in turn break the hibernation function.

F4LL0UT
Get Showgunners!
Registered: Jun 2011
From Poland
Posted May 23, 2017
I'm aware of that; as I said, I will reconsider putting the pagefile on the SSD once I get more RAM. Sadly I only have 8 GB at this point, which is simply not enough for the applications I'm using, at least with projects of that size.

F4LL0UT
Get Showgunners!
Registered: Jun 2011
From Poland
Posted May 23, 2017
Turning it off completely is simply not a good idea. Disabling it when you presume your RAM will suffice means that if you're right there's no gain, but you're at constant risk of having your applications crash the moment you run out of RAM. And whether 16 GB suffices REALLY depends on what software you're using and how.
Case in point, at work both a colleague and I recently ran out of space on the drives that held the pagefiles. Our machines have 16 GB of RAM but we had constant crashes because it simply wasn't enough and Windows wasn't able to extend the pagefile as necessary.