nulian: No crashes at all yet, and I've reached year 3 on the PC version.
Fully upgraded PC with up-to-date drivers and a real graphics card.
"Real graphics card"... well, good for you. I guess you're one of those guys who thinks every single one of us with a GPU older than 7 years isn't a real gamer, and whatnot.
Well, I couldn't care less what you and most of the elitist PC "MASTER RACE" gamers say and think. Everyone should be entitled to play the games they love, regardless of system requirements. That's not to say I don't understand that video gaming depends on ever-evolving technology, but Grim Fandango Remastered doesn't look like The Witcher III, for instance. I look at The Witcher III and I immediately know my computer can't run it, but looking at videos of Grim Fandango Remastered, there's no reason whatsoever to think it would need such high PC specs to run properly.
The Double Fine team was just lazy on this one; they probably did nothing to optimize the game when porting it to the PC. Given the similar architecture that PCs and consoles share this generation, they probably just ported what the Sony team did for the PlayStation 4 and Vita versions, platforms that obviously support the equivalent of OpenGL 3.3 and the like right out of the box. Watching the game run, there's no reason to think it wouldn't work on Windows 2000 with a really old GeForce2.

So why do only "real" gamers get access to Grim Fandango?! From what I've seen on Twitch streams, "real" gamers (i.e. graphics snobs) don't even seem to like Grim Fandango: they say it looks ugly and old, and many of them have never even heard of the game. So if "real" gamers, the kind who think a GPU older than 3 years is ancient, aren't the target audience for this game, why didn't they lower the specs for the PC version, along with some much-needed optimization and stabilization, so the people who *actually* want to play the game could do so?
On a side note, the end-of-year-one cinematic is bugged even on a computer with a "real graphics card". I think you either got very, very lucky and didn't hit the bug, or you're plainly lying.