StingingVelvet: I was really into movies for a while there and got a nice 4k set, but the problem with using it as a big monitor is the input lag. Even in "game mode" it's generally much worse than a PC monitor, which is especially obvious in games with mouse look in my experience.
Could be; as far as I've understood, TVs and computer monitors tend to use different display technologies.

Also on my 47" TV the lag was very noticeable (I could easily detect it when I had my laptop connected to the TV with HDMI, showing the desktop on both displays, and then just scrolled a Notepad text document or a web page; the scrolling on the TV screen was far behind what I saw on the laptop screen). I guess that is mainly because of all the picture post-processing, which naturally takes time: the video feed is processed before it is shown on the screen.
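A slightly more controlled version of that test is to mirror a millisecond counter to both displays and photograph them together: the TV will show an older reading, and the difference is roughly its extra lag. A minimal sketch, assuming Python with the standard-library tkinter (the window title and update interval are arbitrary choices):

```python
# lag_timer.py - display a running millisecond counter; mirror the desktop to
# both screens, photograph them at the same instant and compare the readings.
# Rough estimate only: the counter itself is subject to OS/compositor delays.
import time
import tkinter as tk

root = tk.Tk()
root.title("display lag timer")
label = tk.Label(root, font=("Consolas", 96), text="0")
label.pack(padx=40, pady=40)

start = time.perf_counter()

def tick():
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    label.config(text=f"{elapsed_ms:10.1f} ms")
    root.after(1, tick)  # reschedule as often as the toolkit allows

tick()
root.mainloop()
```

It won't be as accurate as a dedicated lag tester, but it turns the "scrolling looks behind" impression into an actual number.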

However, when I switched off absolutely every post-processing and picture-enhancement feature on the TV, I couldn't see the lag anymore. I am sure there still is some compared to the fastest gaming monitors, but now it is good enough for me. Also, since console gamers play their games on TVs, I guess they expect it (the "gaming mode") to be relatively lag-free.

I recall reading, though, that the most common TV display technologies still have an issue with picture burn-in, something that computer monitors don't have? So maybe keeping the same desktop screen or a static game interface on the TV screen could cause some burn-in... Movies and normal TV programs, with their constantly changing picture, don't cause any problem.
timppu: If and when I buy a 65" (or bigger) TV to replace our current 47" TV, it may well be I will just move the 47" TV to my "work room" and use it as a big-ass second (gaming) monitor there.
StingingVelvet: I was really into movies for a while there and got a nice 4k set, but the problem with using it as a big monitor is the input lag. Even in "game mode" it's generally much worse than a PC monitor, which is especially obvious in games with mouse look in my experience. Sometimes though when I'm playing something like Assassin's Creed I'll hook the PC up to the TV and use a controller.
Input latency was one of the main reasons I stayed away from TVs (non-CRT ones anyway) for so long. Even most of my console gaming was done on a monitor and sometimes my projector. But last year I replaced my projector with a new 120Hz FreeSync TV for theatre stuff, hooked up my consoles in game mode, and I seriously cannot detect any latency compared to my monitor. So it seems TVs have come a long way. I've never measured it, but my view is that if I don't notice it then it's okay, and I can't even notice a difference playing a racing game when moving from the monitor to my TV. The game modes differ hugely between brands and even models though, so trying before buying is ideal if possible.

But one thing is clear: the picture quality of the TV is vastly superior, so much so that I'm thinking of moving my main PC to the TV and bringing my old Core 2 PC out of semi-retirement to connect to the monitor for day-to-day hack stuff like browsing and retro gaming (which is most of my PC gaming these days anyway).
CMOT70: But last year I replaced my projector with a new 120Hz FreeSync TV for theatre stuff, hooked up my consoles in game mode, and I seriously cannot detect any latency compared to my monitor. So it seems TVs have come a long way.
Mine is 22.7 ms, which is pretty good. My monitor is 4 ms. I'm not saying that's a huge difference, but when flipping my aim around in an FPS I do think it's noticeable. Also, response times are usually better on a monitor, which means less blur in dark scenes. Still, you are ABSOLUTELY right that TVs have come a long way in this area, and I bet some newer models would be pretty darn close, especially with FreeSync.
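For a sense of scale, that lag can be expressed in refresh cycles: at 60 Hz one frame lasts about 16.7 ms, so 22.7 ms is roughly 1.4 frames behind, while 4 ms is about a quarter of a frame. A trivial sketch of the arithmetic, using just the figures quoted above:

```python
# Express display lag as a number of frames at a given refresh rate.
def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return lag_ms / frame_time_ms

for lag_ms in (22.7, 4.0):          # TV in game mode vs. monitor, as quoted
    for hz in (60, 144):
        print(f"{lag_ms:5.1f} ms at {hz:3d} Hz = {lag_in_frames(lag_ms, hz):.2f} frames")
```

At 144 Hz the same 22.7 ms is over three frames of delay, which is part of why it feels more obvious with fast mouse aiming.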

My TV is 55" so it wouldn't really work as a monitor anyway, since I use mouse and keyboard for 95% of my games. My monitor already feels too big at 32"!
StingingVelvet: I'm sure there are people who have a PS4 sitting on top of a gaming PC, hooked up to the same monitor, to play exclusives on, who would like mouse control. However I'm also sure this is a very small number of people relative to the market, and that's why developers don't focus on them.
paladin181: My monitor sits on my PS4, but yes, this exact set-up.
Your PS4 needs a wind turbine and liquid nitrogen for cooling and a CO2 turbo injection system.
StingingVelvet: My TV is 55" so it wouldn't really work as a monitor anyway, since I use mouse and keyboard for 95% of my games. My monitor already feels too big at 32"!
My TV is only 50"; I use it as a monitor and it KICKS SERIOUS ASS!
Post edited April 07, 2020 by fr33kSh0w2012
Googling for "tv monitor difference" gives quite a few articles, so I guess I will educate myself a bit more. Apparently I am not the only one thinking "why wouldn't I use my TV as a monitor?", considering big TVs seem to be much cheaper than big monitors.

EDIT: Of course I guess if one wants a 144Hz monitor, then there are hardly any TV equivalents. TVs are ok for people like me who are fine with 60Hz.

Also about the burn-in I mentioned, they say that some more "exotic" TV display technologies, like OLED, can suffer from it.

EDIT: And adaptive sync. I guess you need a monitor for those as well. With TVs it is vsync or no sync, I guess.
Post edited April 07, 2020 by timppu
timppu: With TVs it is vsync or no sync, I guess.
Vsync
timppu: EDIT: Of course I guess if one wants a 144Hz monitor, then there are hardly any TV equivalents. TVs are ok for people like me who are fine with 60Hz.
I highly recommend high framerate. Most people who haven't used it much think it's no big deal, but once you start using it all the time there's pretty much no going back. Even just on the Windows desktop it's so much better. It pains me to play demanding new games like Control at 60fps. That said, most good TVs now are 120hz, so I don't think that's an issue. Though 4k at 120hz is an issue, I believe.

TVs also (mostly) only use HDMI, which lags behind DisplayPort for PC functions in certain areas (like resolution and framerate).
Consoles are becoming PCs? So I can put in and take out whatever hardware parts I want from the PS5, at any time I want? If not, then no, consoles are not becoming PCs, and they still remain vastly inferior to PCs.
StingingVelvet: I highly recommend high framerate.
(144 fps etc.) It is also a question of whether one has enough GPU power to run modern games at such high framerates with good details. Achieving a stable 60 fps with good-looking graphics and a high resolution is already quite demanding, and getting the same at 120 or 144 fps is over twice as hard. Unless, of course, you are ready to limit the graphics options and/or play at a lower resolution (I recall you also prefer >HD resolutions, like 4K or 8K?).
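The "over twice as hard" part follows from the frame-time budget: 60 fps gives the GPU about 16.7 ms per frame, 144 fps only about 6.9 ms, so everything has to be rendered roughly 2.4 times faster (and in practice more, since per-frame CPU work doesn't scale down as nicely). A quick sketch of that arithmetic, using framerates mentioned in this thread:

```python
# Frame-time budget per target framerate, and the speed-up needed vs. 60 fps.
BASELINE_FPS = 60

for fps in (60, 120, 144, 240):
    budget_ms = 1000.0 / fps
    speedup = fps / BASELINE_FPS
    print(f"{fps:3d} fps -> {budget_ms:5.2f} ms per frame "
          f"(~{speedup:.1f}x the rendering throughput of 60 fps)")
```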

Plus, I don't like that many older games have issues at framerates over 60 fps.

So yeah, I personally find 60 fps and 60 Hz (at 1920x1080 resolution) optimal at this point, all things considered.

StingingVelvet: That said, most good TVs now are 120hz, so I don't think that's an issue.
What I understood from those "can TVs be used as monitors?" articles online is that TVs are partly(?) lying when they report those higher refresh rates. I don't recall the specifics, but I got the impression that 120Hz on a TV is not the same as 120Hz on a computer monitor.

EDIT: E.g. here: https://www.pcworld.com/article/2924203/use-your-tv-as-a-computer-monitor-everything-you-need-to-know.html

An HDTV with a high advertised refresh rate may use post-processing technology to achieve that rate, such as by creating additional frames to upscale content, or by adding black frames between each frame to prevent image blur.
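To illustrate what that post-processing means: motion interpolation synthesizes an in-between frame from the two real frames around it (real TVs estimate motion vectors; the plain blend below, on the made-up arrays frame_a/frame_b, is only a crude stand-in), and black frame insertion alternates real frames with black ones. Either way the TV is still only accepting a 60 Hz signal from the source, which is why the advertised number isn't comparable to a monitor's refresh rate. A minimal sketch, assuming NumPy:

```python
import numpy as np

# Two consecutive 60 Hz source frames (tiny made-up 4x4 grayscale images).
frame_a = np.zeros((4, 4), dtype=np.float32)
frame_b = np.ones((4, 4), dtype=np.float32)

# "Motion interpolation", very crudely: insert a synthesized halfway frame,
# doubling the displayed frame count without any new input from the source.
interpolated = 0.5 * frame_a + 0.5 * frame_b
interpolated_stream = [frame_a, interpolated, frame_b]

# Black frame insertion: alternate each real frame with a black one to reduce
# perceived motion blur; again, the source is still only feeding 60 fps.
black = np.zeros_like(frame_a)
bfi_stream = [frame_a, black, frame_b, black]

print(len(interpolated_stream), len(bfi_stream))  # more displayed frames, same 60 Hz input
```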
Post edited April 08, 2020 by timppu
timppu: I got the impression that 120Hz on a TV is not the same as 120Hz on a computer monitor.
What you said is true of older HDMI 1.4 TVs, which were only capable of displaying at 60Hz; many of those TVs used what they called "field refresh", which was just processing behind the scenes to help remove motion blur, and is useless for gaming as the processing adds huge latency.

New TVs that are coming out with HDMI 2.0 and HDMI 2.1 connectivity and advertise 120Hz are truly 120Hz capable, just like a monitor, and they usually have FreeSync as well. HDMI 2.1 can do 4K at 120Hz with HDR. HDMI 2.0 TVs can only do 120Hz at 1440p or 1080p, or 4K at 60Hz, all with HDR. My TV is an HDMI 2.0 FreeSync TV, but I figured it will be some time before most hardware can render new games at 4K 120fps anyway, and it still does 1440p at 120Hz, which is good enough.
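Those limits follow roughly from uncompressed video bandwidth: bits per second = width x height x refresh rate x bits per pixel (plus blanking overhead), compared against roughly 14.4 Gbit/s of usable data rate on HDMI 2.0 and roughly 42.7 Gbit/s on HDMI 2.1. A back-of-the-envelope sketch; the figures are approximate and ignore blanking, chroma subsampling and DSC compression:

```python
# Rough uncompressed bandwidth check: which modes fit HDMI 2.0 vs HDMI 2.1?
# Approximate usable data rates (after encoding overhead), in Gbit/s.
HDMI_2_0 = 14.4
HDMI_2_1 = 42.7

def needed_gbps(width, height, hz, bits_per_channel=8):
    # RGB, 3 channels; blanking intervals ignored, so real needs are a bit higher.
    return width * height * hz * 3 * bits_per_channel / 1e9

modes = [
    ("1080p @ 120 Hz", 1920, 1080, 120),
    ("1440p @ 120 Hz", 2560, 1440, 120),
    ("4K    @  60 Hz", 3840, 2160, 60),
    ("4K    @ 120 Hz", 3840, 2160, 120),
]

for name, w, h, hz in modes:
    need = needed_gbps(w, h, hz)
    print(f"{name}: ~{need:5.1f} Gbit/s  "
          f"HDMI 2.0: {'yes' if need <= HDMI_2_0 else 'no'}  "
          f"HDMI 2.1: {'yes' if need <= HDMI_2_1 else 'no'}")
```

Run it and 4K @ 120 Hz is the only mode in that list that needs more than HDMI 2.0 can carry, which matches the 1440p120 / 4K60 split described above (HDR's 10-bit colour pushes 4K60 right up against the HDMI 2.0 limit, which is why it often falls back to chroma subsampling).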

The long and the short of it is: check the TV's actual specifications, look for the HDMI standard it uses, make sure it has FreeSync (some high-end OLEDs even have G-Sync) and check its output frequency at the common resolutions. But generally all new (this year's models) OLED, QLED and MicroLED TVs are HDMI 2.1; most standard LCDs are not.
Post edited April 08, 2020 by CMOT70
timppu: So yeah, I personally find 60 fps and 60 Hz (at 1920x1080 resolution) optimal at this point, all things considered.
Native resolution is the most important graphics option there is, IMO, so I wouldn't lower that. Even DLSS looks terrible to me. I did get a 1440p monitor instead of a 4k one though, to make it easier to run at native res. 1440p with decent AA is more than good enough IMO, and I've run stuff on my TV at 4k pretty often so I have a good comparison.

You're right that 144fps is tough to reach in a lot of newer games, I'm not gonna lie. I even started a thread (which was widely misunderstood) about how disappointing it was to fall in love with 144fps and then be stuck at 60 for some games. One thing to keep in mind though is that any decent 144Hz monitor today will have G-Sync or FreeSync, so you can run at any fps you want without any negative repercussions. I played Doom Eternal last week at around 100fps, for example, on highest settings. It's only super demanding games like Control that I have to lock at 60 to use decent settings, and that is a bummer.
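The "no negative repercussions" part is what adaptive sync buys you: with plain vsync on a fixed 60 Hz display, any frame that misses the 16.7 ms deadline waits for the next refresh, so frame pacing jumps between 16.7 and 33.3 ms; with G-Sync/FreeSync the display refreshes when the frame is ready. A toy simulation of that difference, using made-up render times:

```python
# Toy comparison of frame presentation: fixed-refresh vsync vs. adaptive sync.
import math

REFRESH_MS = 1000.0 / 60.0                                # fixed 60 Hz display
render_times_ms = [12.0, 15.0, 18.0, 14.0, 20.0, 13.0]    # made-up GPU frame times

def vsync_present(times):
    # Each frame is shown on the next refresh boundary after it finishes,
    # and the next frame only starts after that flip.
    out, t = [], 0.0
    for r in times:
        t += r
        t = math.ceil(t / REFRESH_MS) * REFRESH_MS
        out.append(t)
    return out

def adaptive_present(times):
    # The display refreshes as soon as each frame is done (within its VRR range).
    out, t = [], 0.0
    for r in times:
        t += r
        out.append(t)
    return out

for name, presents in (("vsync 60 Hz", vsync_present(render_times_ms)),
                       ("adaptive   ", adaptive_present(render_times_ms))):
    deltas = [b - a for a, b in zip([0.0] + presents, presents)]
    print(name, [f"{d:.1f}" for d in deltas])
```

The vsync line alternates between 16.7 and 33.3 ms whenever a frame runs long, which is the stutter you feel; the adaptive line simply follows the actual render times.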

The more I use it though, the more glad I am to have it. My perspective now is, why wouldn't you want something so good whenever you can get it, rather than never? Like I said, even on the Windows desktop 144fps is a revelation. I play a lot of indies and older games which run great at high framerate (like Doom 3 right now). I wouldn't trade it in for 4k even if you paid me. There's a reason you constantly see people praising 1440p and 144hz on PC forums. It really is that good.

I'm in no way saying you should do it now though, to be clear. I'm saying I highly recommend it over 4k60, if you're looking to upgrade. :)
Post edited April 08, 2020 by StingingVelvet
StingingVelvet: There's a reason you constantly see people praising 1440p and 144hz on PC forums. It really is that good.
Agreed.

A good monitor is also one of the best investments you can make as a gamer. Even if you can't reach the max framerate now, you will be future-proofed for when you eventually upgrade your PC.

I am currently using a 240 Hz, 1440p monitor, and older games that I can run at that refresh rate feel incredible. Unreal Tournament 2004 is so incredibly smooth it's, well... unreal :P And the games I can't? G-Sync takes care of that.
Post edited April 08, 2020 by idbeholdME
idbeholdME: I am currently using a 240 Hz, 1440p monitor, and older games that I can run at that refresh rate feel incredible.
Expensive
Crevurre: Expensive
I got the Lenovo Legion Y27gq-25 a bit before this whole Corona situation, and I caught a decent discount, so I bought it for around $850. That's about the same as my GTX 1080 Ti cost when I got it about 2 years ago.

And unlike the GPU, unless the monitor starts showing some defects due to overuse, I won't have to buy a new one for many, many years. So definitely a worthy purchase for me.