dtgreene: Note that the situation where compiling uses huge amounts of RAM comes up routinely on Gentoo, since certain packages (like Chromium and LibreOffice) are, in fact, huge C++ projects.
(The same applies to any other source-based distro that offers these packages.)
sanscript: Of course, everything is relative, and poorly optimized code and bugs within the compiler itself are the worst offenders here, but from what I can gather, these aren't the type of huge projects I was thinking of. C and Python seem to use only a few MB per 100,000-200,000 lines, which is excellent for embedded platforms like Arduino and the RPi.
From what I can read, even 12 GB is more than enough for big projects, but someone far more experienced with different types of projects should answer that more definitively, with numbers. Perhaps more RAM than 12-16 GB only saves you 5 seconds... idk.
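If you want to check figures like that yourself, here's a minimal sketch, assuming Linux, a C compiler on PATH, and a hypothetical translation unit big.c; it runs one compile and reports the child process's peak resident set size:

    #!/usr/bin/env python3
    # Rough sketch: measure a compiler's peak memory on one file.
    # Assumptions: Linux; "big.c" is a placeholder for the file you care about.
    import resource
    import subprocess
    import sys

    # Use the command given on the command line, or a hypothetical default.
    cmd = sys.argv[1:] or ["cc", "-c", "big.c", "-o", "/dev/null"]
    subprocess.run(cmd, check=True)

    # ru_maxrss of waited-for children is reported in KiB on Linux.
    peak_kib = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss
    print(f"peak RSS of '{' '.join(cmd)}': {peak_kib / 1024:.1f} MiB")

(GNU time gives you the same number without any scripting: /usr/bin/time -v reports "Maximum resident set size".)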
One thing to consider is how many compiler jobs you run in parallel. A couple of years ago, 8 GB was just barely enough for compiling Firefox on a dual-core (4-thread) laptop, and I had to make sure I didn't have many browser tabs open while compiling. I would assume that's not enough if you want to exploit the parallelism of an 8-core/16-thread CPU. And why would you buy such a CPU if you can't make good use of it?
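As a rough way to pick a job count that fits in RAM, here's a sketch assuming Linux and the oft-quoted guideline of roughly 2 GiB of RAM per compiler job for large C++ packages (the constant is an assumption, not a measured number; tune it for your own builds):

    #!/usr/bin/env python3
    # Sketch: suggest a parallel job count that the machine's RAM can feed.
    import os

    RAM_PER_JOB_GIB = 2  # assumed budget per compiler job for big C++ builds

    def mem_total_gib():
        # Parse MemTotal (given in kB) from /proc/meminfo; Linux-specific.
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("MemTotal:"):
                    return int(line.split()[1]) / (1024 * 1024)
        raise RuntimeError("MemTotal not found")

    # Take whichever is lower: the CPU's thread count or what RAM can feed.
    jobs = max(1, min(os.cpu_count() or 1, int(mem_total_gib() // RAM_PER_JOB_GIB)))
    print(f'MAKEOPTS="-j{jobs}"')

By that rule of thumb, 8 GB lands at -j4, which matches my dual-core laptop experience; feeding -j16 on an 8-core/16-thread machine would call for something on the order of 32 GB.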
I'm still using that laptop, but I have to watch RAM usage all the time and start closing things before it hits the limit. Getting close to OOM is a daily concern.
I have 16 GB on my work laptop, which I use mainly for coding and compiling, sometimes VMs, plus some browsing. I run out of RAM every now and then; I've had to kill things like backups in the past to prevent OOM during a compile.
My desktop (R7 1800X) had 32 GB of RAM, and I ran out a few times under similar usage, so I've now upgraded to 64 GB (and am waiting for a 12-core CPU, which should arrive very soon).