No, when there are 2 times as many pixels, the panel needs to switch the pixels 2 times faster in order to move a given object from point A to point B; that sounds pretty logical to me. High resolution is a demanding attribute and it isn't free... it is a hard cookie to tackle, and a usual IPS LED-LCD panel is far too slow for such a demanding resolution. Really, you have wrong expectations about what it takes to deliver a good "high resolution" picture; the weak spot is the monitor, not your gamer PC. I would stop thinking in terms of "resolution" all the time; resolution is one of many quality aspects, not the only one. A very good picture needs many attributes, and the capability of the monitor or TV matters a great deal... resolution is meaningless if the monitor is junk.
The problem is, again, that you don't understand the technique, so your thoughts about it are wrong. Yes, you have to refresh 4x more pixels for a resolution that is 2x bigger in each dimension, which means the pixel frequency has to be 4x higher. The result is that it is more difficult to build panels with such resolutions, because they need a 4x higher pixel frequency, which can lead to lower refresh rates: with the same pixel frequency, 60 Hz becomes 15 Hz. The refresh rate is directly tied to the pixel frequency and vice versa; the formula is:
refreshRate = pixelFrequency / (horizontalResolution * verticalResolution)
But it is of course possible to build panels with higher resolutions that have the same refresh rate as panels with lower resolutions; they are just more expensive, that's all.
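To make the numbers concrete, here is a minimal Python sketch of the formula above. It ignores blanking intervals (real pixel clocks are somewhat higher than this), and the function name and example pixel clock are just illustrative assumptions:

```python
# Minimal sketch: refresh rate as a function of pixel clock and resolution.
# Blanking intervals are ignored for simplicity; numbers are illustrative.

def refresh_rate(pixel_clock_hz, h_res, v_res):
    """Refresh rate a given pixel clock can drive at a given resolution."""
    return pixel_clock_hz / (h_res * v_res)

pixel_clock = 1920 * 1080 * 60  # a clock that yields 60 Hz at 1080p (~124 MHz)

print(refresh_rate(pixel_clock, 1920, 1080))  # 60.0 Hz at 1080p
print(refresh_rate(pixel_clock, 3840, 2160))  # 15.0 Hz at 4K
```

With the same pixel clock, four times the pixels means a quarter of the refresh rate, which is exactly the 60 Hz to 15 Hz drop mentioned above.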
The time each pixel needs to switch from one color to another, in contrast, is completely independent of the refresh rate, the pixel frequency and the resolution. The pixels shift their colors completely independently of their surrounding pixels once they get the signal to do so. The time until they get that signal is the only thing affected by the resolution, and how it is affected is described above with the pixel frequency, where I also explained that this isn't really an issue, because a 60 Hz panel is still a 60 Hz panel, regardless of resolution.
So to sum it up, you are simply wrong to mix resolution with the reaction time of each pixel, because these two things have absolutely nothing to do with each other...
The TV or monitor needs a reaction time of 1 ms or less at 1080p in order to have full motion sharpness. At 4K it will certainly need 0.5 ms or even less, otherwise blurring is possible.
Just no! I hope the part above explained how all this works and why this claim is just nonsense.
Do not mistake judder for motion sharpness; they are two totally different attributes.
I don't, but I doubt that you understand the effects you are talking about and how they are connected to each other.
Let us talk about a screen with a perfect reaction time of 0. If you run it at 30 FPS with perfect motion sharpness, you will get sick of the resulting image, because humans don't have problems with missing motion sharpness; they have problems with too much motion sharpness at too low refresh rates.
You say yourself that you have problems with the images produced by computers on computer monitors, but you don't have problems with movies on the TV; yet movies on TV have very, very little motion sharpness, while the images generated by the computer on a computer monitor do have motion sharpness! If you are able to, just pause the picture during a motion scene of a movie on the TV and you will see how sharp the motion really is (there is almost no motion sharpness).
Here the missing motion sharpness is created by the camera system that was used to shoot the movie, not by the screen showing it. A bad screen with high latency could produce a similar (but less natural) image with missing motion sharpness; the problem is that computer monitors don't have latencies high enough to do so, and they don't show material that has the missing motion sharpness baked in, like movies do.
To sum it up, the problem you are experiencing is that your eye and brain recognize that the images aren't natural, because in nature motion isn't sharp, it is washed out! To trick the brain so that you don't experience any problems anymore, you NEED washed-out motion (missing motion sharpness), not the opposite (motion sharpness).
The ways to achieve this are to reduce the motion sharpness in the material being shown (like movies on TV do), in the device showing the material (a monitor with bad latency, above 10 or maybe 20 ms), or to use frame and refresh rates so high (and constant) that the eye is no longer able to see the sharpness in the motion. In other words, you need something completely contrary to what you have been talking about all the time.
At 60 FPS it is absolutely useless.
60 FPS is not enough to trick the eye and brain so that the problems some people, like you, experience with motion pictures disappear. Just as an example, the retina of the eye reacts more than 1000 times per second. Even if the brain and the stages between the retina and the brain (approximately 25 to 30 Hz) are much slower than the retina itself, the information isn't completely lost; it is still transferred to a degree.
It is like blending many images into one: no individual image is lost entirely either, even if the information of each image carries less weight than it would without blending.
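As a rough illustration of this blending idea, here is a minimal Python/NumPy sketch; the frame sizes, the moving square and the simple averaging are my own assumptions for illustration, not how a camera or display works internally. Averaging a few consecutive frames of a moving object produces exactly the washed-out smear described above:

```python
import numpy as np

def blend_frames(frames):
    """Average a list of frames (H x W x 3 uint8 arrays) into one washed-out frame."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0).astype(np.uint8)

# Example: a white square moving right across three otherwise black frames.
frames = []
for shift in (0, 8, 16):
    frame = np.zeros((64, 64, 3), dtype=np.uint8)
    frame[24:40, shift:shift + 16] = 255  # the square at its current position
    frames.append(frame)

blurred = blend_frames(frames)  # the square becomes a smeared streak
```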
THE STUFF THE MONITOR IS ABLE TO HANDLE. Think about the true meaning.
The problem isn't the stuff the monitor can handle; it is that you mixed things up and don't understand what really causes the problems you are talking about!
Tearing is no issue; every game usually has a VSync mode and it will normalize the FPS rate. Tearing usually only appears when the internal frame rate is much higher than the TV/monitor frame rate of 60 Hz.
You are again misinformed. VSync doesn't remove tearing unless it is used in combination with triple buffering; without triple buffering it just limits the frames rendered to the frame rate of the monitor, nothing else. It is also wrong that tearing only appears at frame rates higher than the refresh rate of the monitor: it appears whenever the images aren't sent to the monitor synchronously with the monitor's refresh.
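To illustrate that timing point, here is a minimal Python sketch, not tied to any real graphics API, under the simplifying assumption that a tear becomes visible whenever a buffer swap lands anywhere inside a scanout instead of in the vertical blank. The function names, tolerance and the 45 FPS example are made up for illustration:

```python
# A 60 Hz monitor scans out the front buffer once every ~16.7 ms. If the
# buffer is swapped while a scanout is in progress, the displayed image is
# stitched from two different frames: that is tearing. It depends on *when*
# the swap happens relative to the refresh, not on whether the frame rate
# is above or below 60 FPS.

REFRESH_INTERVAL = 1.0 / 60  # seconds per refresh of a 60 Hz monitor

def is_torn(swap_time, tolerance=1e-6):
    """True if a buffer swap lands mid-scanout instead of on a refresh
    boundary (the vertical blank), which is when a tear becomes visible."""
    offset = swap_time % REFRESH_INTERVAL
    return tolerance < offset < REFRESH_INTERVAL - tolerance

# Unsynchronised swaps at ~45 FPS (below 60!) still tear...
unsynced = [i / 45 + 0.004 for i in range(10)]
print(any(is_torn(t) for t in unsynced))   # True

# ...while swaps aligned to the refresh boundary (what VSync enforces) do not.
synced = [i * REFRESH_INTERVAL for i in range(10)]
print(any(is_torn(t) for t in synced))     # False
```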
The only monitors with sufficient speed are AMOLED or any form of true LED, but those monitors are still under heavy development; they may become mature in around 5 years, but at the current time they are far too immature and extremely expensive.
This might be true for big screens, but it isn't true for small screens like those used in smartphones. There, OLED displays are nothing unusual anymore, and did you know that they are often used for screens with high resolutions, the very displays you are blaming for missing motion sharpness?
You may have good general hardware and software knowledge, but when it comes to picture hardware you seem to have a weak spot; you always focus on aspects that cannot produce quality if the technology is insufficient as a whole. You are looking at things that have nothing to do with the capability of providing moving-picture sharpness; that is a technology-related limitation, and it is hard to improve that value (no matter what a manufacturer tells you).
The thing is that you don't understand how everything is connected, and I doubt that you even have basic knowledge of how human perception works, which is a big, if not the biggest, part of this subject. So the biggest problem isn't the hardware; it is perception and the way to trick it, and how to trick it of course has to do with the things I mentioned. I hope I have made that connection clear this time. I also hope the explanation was good enough to show why your understanding of the monitor hardware was almost entirely wrong, so that you drew wrong conclusions from it.
Of course, when a picture is produced interlaced, there will be sharpness issues related to the media, because it does not provide a full progressive image. Interlacing is a dirty trick to reduce the limitations that "insufficient technology" has, but a new TV of the newest spec has no issue with progressive pictures (such as the ones you get from any game), so that is not a matter at all. I studied picture-related stuff for far too long, so it is not easy to fool me with nonsense.
Welcome to the club; it was also part of my studies, but I had no problems understanding all of this quickly enough that I didn't have to study it for far too long, as you did.
A modern monitor of course has no problem with progressive images, but the problem is that it usually doesn't get such progressive images: almost all the media a TV receives isn't progressive, it is interlaced! Interlaced images of course look bad, no matter which monitor you are using. I also know about deinterlacing filters, but they are again just workarounds, like interpolation and similar tricks!
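Just to show what such a workaround looks like, here is a minimal NumPy sketch of simple "bob" deinterlacing, under my own assumption about the field layout (even lines = one field, odd lines = the other); real deinterlacers in TVs are considerably more sophisticated:

```python
import numpy as np

def bob_deinterlace(frame):
    """Split an interlaced frame into its two fields and expand each field
    to full height by line doubling ("bob" deinterlacing)."""
    even_field = frame[0::2]   # lines 0, 2, 4, ...
    odd_field = frame[1::2]    # lines 1, 3, 5, ...
    return (np.repeat(even_field, 2, axis=0),
            np.repeat(odd_field, 2, axis=0))

# Illustrative input: a random 480-line interlaced frame.
interlaced = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
frame_a, frame_b = bob_deinterlace(interlaced)  # two full 480-line frames
```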
And I don't know why you are hyping the very old VGA standard and have bad words for HDMI; HDMI has its advantages too and is useful in many respects. DVI doesn't have an audio channel, so anyone using DVI will need a separate audio line. Ultimately those two standards are compatible with each other, because the signal can be converted between them without any loss in quality, so I don't know why you are telling me that DVI is good and HDMI bad; both basically use the same picture technology, HDMI just has a different interface.
First, where did I hype VGA? I just can't remember; can you show me where?
Did you really study this stuff?
This really doesn't look like you did,