teceem: Does anyone feel that playing (for example) The Witcher 3 at 144 fps / Hz is a better experience than, let's say; 45 fps, or even 60 fps?
It's a very subjective thing that differs from one game and one person to the next. As others have said above, 30fps vs 60fps in a point & click adventure or an old turn-based strategy game with fixed animation rates is a different matter from an FPS. Some game engines, eg, Adventure Game Studio, are locked to 40fps with most people not really noticing it given the style of play.
Personally, I can see the difference between 144fps and 60fps, but it's a far smaller jump than 60fps vs 30fps, and I'd rather aim for a stable, fluid, fixed 60fps than a "144fps experience" that constantly fluctuates between 60-150fps. The higher you go, the harder it is to maintain a fixed minimum frame rate unless you only buy top-end hardware (which the newest games will drag back down again, often faster than their visual improvements justify), and I'm long done with that rat-race.
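The diminishing returns are easier to see if you think in frame times rather than frame rates. Here's a rough back-of-the-envelope comparison (just a standalone snippet with illustrative numbers, nothing game-specific):

```cpp
// Frame time at each rate, and how much each step up actually shaves off per frame.
#include <cstdio>

int main() {
    const double rates[] = {30.0, 60.0, 144.0, 240.0};
    double prev_ms = 0.0;
    for (double hz : rates) {
        double ms = 1000.0 / hz;  // how long each frame stays on screen
        if (prev_ms > 0.0)
            printf("%3.0f fps = %5.2f ms/frame (%5.2f ms less than the previous step)\n",
                   hz, ms, prev_ms - ms);
        else
            printf("%3.0f fps = %5.2f ms/frame\n", hz, ms);
        prev_ms = ms;
    }
    return 0;
}
```

30 to 60fps cuts ~16.7ms off every frame; 60 to 144fps only cuts ~9.7ms, and 144 to 240fps less than 3ms, which is why each jump feels smaller than the last.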
teceem: I know that there are a few 240 Hz gaming monitors out there. I can't really experience it without buying one - but I just can't imagine that with the kind of games I play and how I play them - that I'll ever notice much of a difference.
That stuff is partly aimed at super-competitive MP shooters like Fortnite, Overwatch, etc, to minimise latency, and partly "gamer" marketing, ie, "buy our super-leet 240Hz monitor over their 144Hz monitor" - for the same reason you 'need' a $499 "Gamer Chair" and 500x RGB LEDs in your case. That makes you a "Real Gamer (tm)". ;-)
teceem: I've read these "scientific" articles too... often explaining why movies (film in general) are 25 or 30 fps. Have you ever seen a movie in 60 fps? I definitely see the difference! But no, I don't prefer it - it might be more realistic but (to me) it looks more like a "home video".
A lot of that is habituation, ie, people have simply been used to seeing 24fps for decades and are "surprised" the first time they see something else. If 60fps movies became standard, the opposite would happen: people would become habituated to that and see 24fps as stuttery (or at least notice the blurred motion that comes with it). The same is true of games. 240Hz vs 144Hz may be "hardcore shooter" territory, but once you're used to 60fps, it's hard to go back to 30fps even in non-action-centric games. Eg, even isometric RPGs like Dragon Age / Divinity: Original Sin look much smoother when moving / panning the camera across the screen.
teceem: Wasn't THE classic fps Doom (2003) capped at 35 fps? I've never heard anyone saying that it was laggy.
Original Doom (1993 for MS-DOS) has a 35fps cap, but as others mentioned it also lacked mouse-look, which hid a lot of the problems. GZDoom removes this cap and interpolates monster movement between tics, and moving / turning / aiming with a mouse very definitely plays much better overall at 60fps.
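The smoothing trick itself is conceptually simple. This is only a generic sketch of fixed-tic logic plus render-time interpolation (not GZDoom's actual source - the names and structure here are made up for illustration): the game still ticks at 35Hz, but each rendered frame blends an actor's previous and current tic positions.

```cpp
// Generic fixed-tic + interpolation sketch (illustrative only, not GZDoom code).
struct Actor {
    float prev_x, prev_y;   // position at the last 35 Hz logic tic
    float x, y;             // position at the current tic
};

// 'alpha' = how far we are between two tics when this frame is drawn (0..1).
inline float lerp(float a, float b, float alpha) {
    return a + (b - a) * alpha;
}

void render_actor(const Actor& m, float alpha) {
    float draw_x = lerp(m.prev_x, m.x, alpha);
    float draw_y = lerp(m.prev_y, m.y, alpha);
    // ...draw the sprite at (draw_x, draw_y) instead of snapping to (x, y),
    // so a 60+fps renderer shows smooth movement between 35 Hz game updates.
    (void)draw_x; (void)draw_y;
}
```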
Really, you can test 30 vs 60fps in a lot of your own games yourself by capping fps to 30 then swapping back to 60. You can either do this in your GPU driver (eg, for nVidia there's an "Adaptive (half refresh rate)" vsync option that limits it to half of your monitor's refresh rate), or alternatively utilities like MSI Afterburner have frame-rate limiters built in.
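If you're curious what a limiter is doing under the hood, it's basically just sleeping off the unused part of each frame's time budget. A toy sketch below - this isn't how Afterburner/RTSS actually hooks into a game (they intercept the present call), the loop and function names are purely hypothetical:

```cpp
// Toy frame limiter: after the game's work for a frame, wait out
// whatever is left of the budget (e.g. 33.3 ms per frame for 30 fps).
#include <chrono>
#include <thread>

void run_capped(double target_fps) {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::duration<double>(1.0 / target_fps);
    auto next_frame = clock::now();
    for (int frame = 0; frame < 300; ++frame) {       // ~10 seconds at 30 fps
        // update_and_render();                        // the game's normal per-frame work
        next_frame += std::chrono::duration_cast<clock::duration>(frame_budget);
        std::this_thread::sleep_until(next_frame);     // sleep off the rest of the budget
    }
}

int main() { run_capped(30.0); }
```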