
You can also explain it to me in German, but this is an English board, so it might not be welcome.
I already suspected that when I read Switzerland as your country of origin, and it would indeed simplify quite a few things, but for exactly the reason you mentioned I avoided it.

Now you are making a comparison between a low-power and a high-power CPU; you are simply going beyond the line of "common sense" (or let's say it's a hit below the belt). Indeed, many clueless smartphone owners try to compare smartphone CPUs with Intel CPUs, but I'm not one of them, and the difference between RISC and CISC is another category. It's not comparable to the comparison between low- and high-power CPUs. A smartphone ARM CPU may be a RISC design, but it's still a low-power CPU; it can't be compared to a high-power CPU, we don't even need to talk about it. It's a totally different design for a different use.
Those two RISC CPUs are more similar to each other than either of them is to the CISC CPU. But true, the point was that you can't just compare the clock speeds of CPUs which have different architectures.

One moment! Now you are mixing some stuff up here, because a game works differently than a movie. A movie doesn't have jagged edges with the same content, but a game does! So you are asking about an "internal improvement" of the way a game calculates its pixels, and saying that it will need a higher "internal resolution". But the screen resolution isn't the true issue; a 4K2K monitor can show pictures without any edges, especially on an HDTV even at a size above 50 inches. Heck, even 1080p can be smooth and almost without any edges (with AA enabled). So it's not a problem of the screen resolution but of the way the game handles a pixel.
No, I am not mixing anything. Anti-aliasing was invented because edges looked bad at low resolutions, but if your resolution is high enough, the edges start to look better and better with and without anti-aliasing. Anti-aliasing is just a workaround because we can't yet get resolutions (for reasonable money) that are high enough that we would no longer need anti-aliasing at all.

Also, the power necessary to render a frame with anti-aliasing (supersampling, for simplicity) is the same as rendering the whole thing at a bigger resolution, or even less, because no downsampling is necessary. It is almost always better to use a higher resolution than anti-aliasing, if possible, because a higher resolution adds more detail than anti-aliasing can, but both make things smoother.
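A quick back-of-the-envelope sketch of that point, assuming "4x supersampling" means rendering at twice the width and twice the height and then downscaling (the resolutions are just illustrative numbers):

full_hd = (1920, 1080)
ssaa_4x = (full_hd[0] * 2, full_hd[1] * 2)   # internal render target for 4x SSAA
uhd_4k  = (3840, 2160)

def pixels(res):
    return res[0] * res[1]

print(pixels(ssaa_4x))   # 8,294,400 shaded pixels
print(pixels(uhd_4k))    # 8,294,400 shaded pixels: the same workload,
                         # and native 4K even skips the downscaling pass.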

Just have a look at Retina displays: they have no need for anti-aliasing anymore, and text, for example, looks much better than anti-aliased text on a usual computer screen. For a big screen where you sit close enough (which you usually do when sitting in front of a computer), a very high resolution is necessary to get the same effect; even 4K may not be enough (if the pixel density is constant over the whole screen and you have good eyes).
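As a rough sketch of that "close enough" argument, here is the common 1-arcminute acuity rule of thumb (my assumption, not anything from this thread): pixels stop being individually resolvable once one pixel subtends less than about one arcminute at the viewing distance.

import math

def min_ppi(viewing_distance_inches, acuity_arcmin=1.0):
    # pixel density at which one pixel spans `acuity_arcmin` of visual angle
    pixel_pitch = viewing_distance_inches * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pixel_pitch

print(round(min_ppi(12)))  # phone held ~12" away -> roughly 286 PPI needed
print(round(min_ppi(39)))  # monitor/TV at ~1 m (39") -> roughly 88 PPI needed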

Once again, the TV has a fixed number of pixels; it isn't responsible for the way a computer handles its internal pixels. It's true that a higher resolution would render AA unnecessary
That's the point. The result would even look better!

but even an AMD Radeon GPU needs much more power at higher resolutions; it doesn't come for free.
True, but the same goes for anti-aliasing. If you can render 4x AA (supersampling) at Full HD resolution, you can also render at 4K, because it is the same resolution the graphics card has to render. This generation is still too slow for the latest AAA games, but one of the next generations won't be, and even today's graphics cards are fast enough for older games. I think this can happen even within the lifetime of today's next-gen consoles, which often can't even handle Full HD properly (especially the Xbox One).

But internal resolution and TV/monitor resolution simply aren't the same, and that's the reason for the edge issues any game will have at resolutions below 1080p and/or without any AA.
Of course there are differences; they were too obvious to mention, but they are the same in terms of what the graphics card has to render.

Besides, it's not true that "the bigger the better" always holds for gaming. In my experience, a 50-inch TV at a range of 2 m is close to perfect for gaming purposes. Smaller than that and there is some lack of immersion; bigger than that and viewability may become an issue, because you simply can't keep the entire screen inside the same viewing angle anymore. Most projectors are already too big for competitive gaming; they can't allow maximum performance because they exceed the active viewing angle of human vision. A 50-inch HDTV is close to perfect; that's my opinion, but you can have your own.
True, but a bigger screen increases the immersion drastically. Let's talk about car driving: there you have the same active viewing angle, but I doubt you want to shut the side windows just because your active viewing angle is too small to actively see what's there. Now let's talk about a car driving game: why shouldn't you be able to have the same view to the sides? This would increase the immersion drastically, especially because your eyes can wander around the screen, so that your effective angle is increased, up to 360°. The immersion is just perfect if you can look behind you and see the back seats of your virtual car.

One way to reach this would be to build a 360° screen with a high resolution (MUCH higher than 4K); another way is something like the Oculus Rift (maybe with dynamic resolution: more in the center, less in the outer regions). The big viewing angle is one of the reasons why that device is so astonishing.

An Eyefinity setup at a range of less than 1 m, and probably even several 30+ inch monitors, will have a lot of content outside the active viewing angle. Immersion could be good, but it makes no sense for hardcore gaming; too much content sits outside the active viewing angle. But just as I said, to everyone their own stuff; we are free when it comes to such things (as long as the coins are available).
I don't get what this has to do with hardcore gaming or why it even matters. Also, I think it would be an advantage if you could turn your head left and right to see what is left and right of you, instead of using the mouse for this.

My own experience: 1080p at 50 inches is perfect for classic games, the best quality possible. A 4K2K display will be close to perfect for the newest games and future game content, and both formats are compatible with each other because 4K is simply twice the pixels of 1080p in each dimension (so it's easy to switch without format issues).
My experience is that 1080p on a 22" screen 1 m in front of you is anything but enough, even with anti-aliasing! For bigger screens you would need an even higher resolution.

It's not healthy to sit way too close to a screen for extended periods, but just as I said, it's not my eyes, and everyone is free to hurt their own. ;)
It's also not healthy to sit way too far from a screen for extended periods!

Healthy is a changing distance, or something where your eyes have to refocus. Maybe it will be possible to focus naturally with your eyes on a next-gen head-mounted display, who knows?
Post edited June 24, 2014 by ThomasD313
The Retina display is one huge marketing gimmick, plain advertisement, nothing more.

Apple is always very skilled at selling new stuff without real use in many respects, and I'll tell you why: even at 1080p the technology is NOT fast enough to avoid showing blurred pictures when motion is added. Many Apple products still have big motion-sharpness issues, yet most consumers apparently don't notice it.

I AM sensitive to a lack of motion sharpness, and I would rather have a display with half the resolution but a much sharper moving picture. Especially for gamers that's a huge issue, because games sometimes move a lot and very fast! With pictures washed out by motion blur, your so-called "rate of detail" will totally die... the picture won't be accurate anymore.

So all the Retina displays, or any of the 4K2K displays without OLED or another technology with super quick response times, will just be useless for anyone sensitive to inaccurate picture handling and a lack of motion sharpness. But I know most people who praise Retina displays apparently have no clue what "motion sharpness" means; it is very important!

Currently I use the newest Panasonic plasma TV... and I can tell you that TV has amazing motion sharpness, and only the very best monitors can compete on that spec.

So, you're sensitive to low resolution and I'm sensitive to the lack of motion sharpness... that's a difficult situation.
Post edited June 25, 2014 by Xeshra
Maybe the word Retina display is just marketing, but the thing it stands for isn't. I have even wondered why this wasn't done YEARS before. Full HD resolutions have been available for almost 20 years, even at a price that wasn't too high. The whole thing just stopped developing; each year they even released stuff that was worse and worse. Only the cheaper IPS panels were a real improvement, and, because of 3D, the >100 Hz techniques.

OK, I am not really wondering; there are some reasons why they stopped increasing resolutions, here is a brief list:
1. A high pixel density results in small text and other elements if not scaled (which didn't work very well in the past on Windows systems and with software made for Windows).
2. The display interfaces couldn't transfer such high resolutions, or high framerates at high resolutions, until DisplayPort (or Thunderbolt) was invented, because VGA and DVI are very outdated and HDMI is even complete bullshit.
3. Production wasn't good enough to make displays with such resolutions without defects.
4. They were just too expensive.
5. Everyone wanted to have Full HD, even if they didn't know what it stands for...
6. The existing displays still sold well enough.

But none of these have anything to do with the motion of the displayed pictures, except the point about the display interfaces: high framerates at high resolutions aren't possible with most of today's interfaces, except DisplayPort and Thunderbolt, which are capable of high framerates at high resolutions.

In other words, high or low resolution has nothing to do with motion sharpness, except that in cases where the bandwidth is too small the framerate can end up lower than it would be at a lower resolution, which is no issue for DisplayPort or Thunderbolt.
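A rough bandwidth estimate to illustrate that bandwidth point, for uncompressed 24-bit RGB video and ignoring blanking intervals and encoding overhead; the per-link limits in the comments are approximate figures from memory, not exact specifications.

def gbit_per_s(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

print(gbit_per_s(1920, 1080, 60))   # ~3.0 Gbit/s  -> fine even on older links
print(gbit_per_s(3840, 2160, 60))   # ~11.9 Gbit/s -> above HDMI 1.4's roughly
                                    # 8 Gbit/s of video bandwidth, but within
                                    # DisplayPort 1.2's roughly 17 Gbit/s,
                                    # which is why 4K at 60 Hz needed DisplayPort first.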

But I don't even understand what exactly your problem with this issue is. Is it that the frames are blurred, or that they aren't blurred? Because you are contradicting yourself many times in this topic.

First you say that the pictures are blurred and that's why you have problems with them; then you say that the motion on TVs is better, although they show pictures that are even much more blurred, because their media is blurred!

Now there are two cases with low framerates. Either the shutter time of the camera was very short, which leads to ugly motion on fast-responding screens (like computer displays) because the frames aren't blurred; or the shutter time is long, which leads to motion similar to what slow-responding screens show, because the frames are blurred.

For movies, which are usually displayed on TVs, the second case is typically used, which is the more blurred one, because to most humans it looks more natural (at low framerates like movies, which have framerates between 25 Hz and 30 Hz). The first case, on the other hand, is used for computer games, which are usually displayed on computer monitors, because rendering blurred images is very hard to do and would cost a lot of performance to look good enough, so the frames of games are usually completely free of any blur and look bad at low framerates.

While the first behavior doesn't look perfect, the second behavior is a problem for many people. I think it is also the problem you are noticing with computer displays, but describing completely wrong, because the images there aren't blurred and are displayed on fast-responding screens at low framerates (60 Hz).

A simple solution for such non-blurred images, like the ones shown on fast computer displays, could be to increase the framerate of the input (the FPS of the game) and the refresh rate of the screen, so that the motion doesn't have to be blurred by the technique itself (like on TVs) but by the eye (because the eye becomes too slow). This would even look much better than on any TV with its blurred 30 Hz media! And again, both are possible with low- and with high-resolution displays!

Another problem which you may be noticing and having an issue with could be tearing, because TVs are usually synchronized to the media they show, while computer displays usually are not (because games, for example, don't deliver constant framerates while movies do). The solution for this is to use VSync, and/or to wait for FreeSync or G-Sync, which are being developed and implemented right now and will deliver better results. Again, this has nothing to do with resolution.

Another possibility is that you are using a TV that has a higher refresh rate than its input material and you started to like the interpolation or black-frame-insertion techniques, but I would never say that such things (especially interpolation) are really the way to go, because they are just workarounds to deal with the low framerate of the input media, and in my opinion they make the whole motion look very unnatural and odd.

All in all, this topic is interesting but again has almost nothing to do with resolution, because all of these behaviors and facts can be true or false for low- as well as high-resolution displays, and often the media shown on the display is the problem and not the display itself.
Post edited June 28, 2014 by ThomasD313
No: when there are 2 times the number of pixels, the panel needs to switch the pixels 2 times faster in order to move a certain object from point A to point B; sounds pretty logical to me. High resolution is a demanding attribute and it isn't free... it's a hard cookie to tackle, and a usual IPS LED-LCD panel is far too slow for such a demanding resolution. Really, you have the wrong expectations of what it takes to deliver a good "high resolution" picture; the weak spot is the monitor, not your gamer PC. I would stop thinking in terms of "resolution" all the time; resolution is one of many quality aspects, but not the only one. A very good picture needs many attributes, and the capability of the monitor or TV matters a lot... resolution is meaningless if the monitor is junk.

A framerate of 30 FPS may produce judder, but that has nothing to do with motion sharpness; do not mistake judder for motion sharpness, they are two totally different attributes. A framerate of 60 FPS or more will not produce judder anymore, but motion sharpness will become an issue. The TV or monitor needs a reaction time of 1 ms or less at 1080p in order to have full motion sharpness. At 4K it will certainly need 0.5 ms or even less, otherwise blurring is possible.

I do not use interpolation; that's a tech for weaklings, those who don't know what picture quality means. My TV runs in a raw 1:1 picture mode; it only shows the true material and nothing "artificial". Interpolation is just a dirty trick to remove judder, nothing else, and the overall quality is not raised at all because you can't produce accurate material "out of nowhere". Every true movie lover uses the raw movie style, and a gamer has no need for interpolation; it's useless. A TV will produce input lag when interpolation is active (the processing creates a spike-like delay of up to 100 ms), which is messy for gamers, and once again: at 60 FPS it's absolutely useless.

ThomasD313: For movies, which are usually displayed on TVs, the second case is typically used, which is the more blurred one, because to most humans it looks more natural (at low framerates like movies, which have framerates between 25 Hz and 30 Hz).
That's not a lack of motion sharpness, that's simply frame interpolation. It may result in blurred or inaccurate images, but it has nothing to do with the technically limited motion sharpness issue. You are trying to name every single possibility that could make it suffer from something "blurring-like" except the thing that truly matters... THE STUFF THE MONITOR IS ABLE TO HANDLE. Think about the true meaning.

ThomasD313: Another possibility is that you are using a TV that has a higher refresh rate than its input material and you started to like the interpolation or black-frame-insertion techniques, but I would never say that such things (especially interpolation) are really the way to go, because they are just workarounds to deal with the low framerate of the input media, and in my opinion they make the whole motion look very unnatural and odd.
That's just guessing; in my whole history I have never used interpolation and I never enjoyed it. If I truly liked interpolation I might have bought Samsung junk and smiled every single day at the supreme picture soup. If I ever watch an anime movie with interpolation enabled, I will rather commit hara-kiri than continue with such a dishonorable action.

I'm a die-hard lover of the most natural, extremely sharp and crisp picture... only a few monitors are barely capable of handling it, one of them being a plasma flagship.

Tearing is no issue; almost every game has a VSync mode and it will normalize the FPS rate. Tearing usually only appears when the internal framerate is much higher than the TV/monitor refresh rate of 60 Hz.

The currently best monitors are around 2 ms; LED-LCD tends to be a bit higher than that, and the fastest plasmas are around 2 ms on average. That's still too high, but it's very good, with little blurring or artifacting.

The only monitors with sufficient speed are AMOLED or any form of true LED, but those monitors are still under heavy development; they may become mature in around 5 years, but at the current time they are way too immature and extremely expensive.

I would never get any screen technology other than true LED if I used any resolution higher than 1080p. Regarding 1080p, for gaming purposes, I use plasma only (the fastest plasma ever made, with 3000 Hz sub-field drive).

You may have good general hardware and software knowledge, but when it comes to picture hardware you seem to have a weak spot; you always seem to focus on aspects that are unable to produce quality when the technology is insufficient as a whole. You are looking at things that have nothing to do with the capability of providing moving-picture sharpness; that's a technology-related limitation and it's hard to improve that value (no matter what a manufacturer tells you).

I am not using any dirty trick at all, nor any trick ever, because I'm a lover of the true picture directly from the raw source.

The interlacing matter:
Of course, when a picture is made interlaced, there will be sharpness issues related to the media because it doesn't provide a full progressive image. Interlacing is a dirty trick to work around the limitations of "insufficient technology", but a new TV of the latest spec has no issue with progressive pictures (such as the ones coming from any game), so that's not an issue at all. I studied picture-related stuff for way too long, so it's not easy to fool me with nonsense.

And I don't know why you are hyping the very old VGA standard and have bad words for HDMI; HDMI has its advantages too and is useful in many respects. DVI doesn't have an audio channel, so anyone using DVI will need a separate audio line. Ultimately those two standards are compatible with each other, because the signal can be converted without any loss in quality, so I don't know why you are telling me that DVI is good and HDMI bad; both basically use the same picture technology, HDMI just has a different interface.

Seriously... you're telling me weird stuff; I'm a picture enthusiast for a reason and I know pretty well what a good picture looks like and what to use.

Besides, just tell me your monitor and I may be able to tell you how hard motion sharpness will hit you; it's clear that you will suffer a sharpness issue unless you use AMOLED or another true LED technology. But just as I said, that technology is still immature and has many weak spots at the current time (affecting general picture quality and lifetime).

I'm almost certain that I wouldn't even want your "super high res" monitor for free, because it may lack all the other important picture values I love so much: general color quality, black levels, motion sharpness and even more stuff. The very old LCD-based panels had such terrible picture quality that not even 1 billion pixels would have been of use; in fact it would only make matters worse. General picture quality and number of pixels have very little to do with each other. Indeed, more pixels can be an advantage for even more detail, but it's still just one of many factors and not necessarily the most crucial one.

In the old days I enjoyed the crisp CRT picture much more than the "super high resolution" of a computer monitor, because the computer monitor had a lot of detail, yes, but the picture always looked "grayed out"; it simply lacked the crisp and sharp picture of a good CRT TV. The CRT had a lower resolution, YES, but the picture looked more realistic. I know you are unable to understand those words, so it's fine to just read them and forget about it.

But the PC was never the solution to everything; originally it started out as a "worker machine" so the bureaucrats could attack innocent people with huge documents and economic-system-related matters; it wasn't meant to be the entertainment machine it has become nowadays. The Retina stuff serves the same use: it's for standing picture superiority, mainly for the older, mature, "economic system controlling" generation, so people can see documents better than ever before and read even more letters on a single screen (sometimes using binoculars, when eyesight is bad). Ultimately they can put even more legal matters on a single screen, so a huge resolution is truly very beneficial for that, but not for real gamers who need "moving picture sharpness"; they should rather wait for a mature and proper AMOLED, I think. Of course, a plasma is a very bad bureaucrat display; it's not good for static white pictures with black letters on them, so everyone has different needs, that's understandable.
Post edited July 04, 2014 by Xeshra
No: when there are 2 times the number of pixels, the panel needs to switch the pixels 2 times faster in order to move a certain object from point A to point B; sounds pretty logical to me. High resolution is a demanding attribute and it isn't free... it's a hard cookie to tackle, and a usual IPS LED-LCD panel is far too slow for such a demanding resolution. Really, you have the wrong expectations of what it takes to deliver a good "high resolution" picture; the weak spot is the monitor, not your gamer PC. I would stop thinking in terms of "resolution" all the time; resolution is one of many quality aspects, but not the only one. A very good picture needs many attributes, and the capability of the monitor or TV matters a lot... resolution is meaningless if the monitor is junk.
The problem is again that you don't understand the technique, so your conclusions about it are wrong. Yes, you have to refresh 4x more pixels for a resolution that is 2x bigger in each dimension, which leads to a pixel frequency that has to be 4x higher. The result is that it is more difficult to build panels with such resolutions, because they need a 4x higher pixel frequency, and this can lead to lower refresh rates: 60 Hz would become 15 Hz at the same pixel frequency. The refresh rate is directly tied to the pixel frequency and vice versa; the formula is:
refreshRate = pixelFrequency / (horizontalResolution * verticalResolution)
But it is of course possible to build panels with higher resolutions that have the same refresh rate as panels with lower resolutions; they are just more expensive, that's all.
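A minimal numeric sketch of the formula above (ignoring blanking intervals, so the numbers are simplified): the same pixel clock drives only a quarter of the refresh rate once you quadruple the pixel count, and keeping 60 Hz at 4K simply needs a 4x faster clock.

def refresh_rate(pixel_frequency_hz, horizontal, vertical):
    return pixel_frequency_hz / (horizontal * vertical)

pixel_clock_1080p60 = 60 * 1920 * 1080                     # ~124.4 MHz

print(refresh_rate(pixel_clock_1080p60, 1920, 1080))       # 60.0 Hz
print(refresh_rate(pixel_clock_1080p60, 3840, 2160))       # 15.0 Hz with the same clock
print(refresh_rate(4 * pixel_clock_1080p60, 3840, 2160))   # back to 60.0 Hz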

The time each pixel needs to switch from one color to another is, in contrast, completely independent of the refresh rate, the pixel frequency and the resolution. The pixels shift their colors completely independently of the surrounding pixels once they get the signal to do so. The time until they get the signal is the only thing affected by the resolution. How that time is affected is described above with the pixel frequency, where I also explained that this isn't really an issue, because a 60 Hz panel is still a 60 Hz panel, independent of resolution.

So to sum it up, you are just completely wrong in mixing up resolution with the reaction time of each pixel, because these two things have absolutely nothing to do with each other...
The TV or monitor needs a reaction time of 1 ms or less at 1080p in order to have full motion sharpness. At 4K it will certainly need 0.5 ms or even less, otherwise blurring is possible.
Just no! I hope the part above explained how all of this works and why that claim is just nonsense.
do not mistake judder for motion sharpness, they are two totally different attributes
I don't, but I doubt that you understand the effects you are talking about and how they are connected to each other.

Let's talk about a screen with a perfect reaction time of 0. If you run it at 30 FPS with perfect motion sharpness, the image will make you sick, because humans don't have problems with missing motion sharpness; they have problems with too much motion sharpness at too low refresh rates.

You say yourself that you have problems with the images produced by computers on computer monitors but not with movies on the TV, yet the movies on TV have very, very little motion sharpness, while the images generated by the computer on computer monitors do have motion sharpness! If you are able to do so, just pause the picture while it is showing motion in a movie on the TV and you will see how sharp the motion really is (there is almost no motion sharpness).

Here the missing motion sharpness is created by the camera system that was used to shoot the movie, not by the screen that is showing it. A bad screen with high latency could create a similar (but less natural) image with missing motion sharpness; the problem is that computer monitors don't have high enough latencies to do so, and they don't show material that has the missing motion sharpness baked in, like movies do.

To sum it up, the problem you are experiencing is that your eye and brain recognize that the images aren't natural, because in nature motion isn't sharp, it is washed out! To trick the brain so that you don't experience any problems anymore, you NEED washed-out motion (missing motion sharpness), not perfectly sharp motion (motion sharpness).

The ways to achieve this are to reduce the motion sharpness in the material being shown (like movies on TV do), in the device showing the material (a monitor with bad latency, > 10 or maybe 20 ms), or to use frame and refresh rates so high (and constant) that the eye is no longer able to see the sharpness in the motion. In other words, you need something completely contrary to what you have been talking about all the time.
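A rough rule of thumb for that last option (my own sketch, not anything claimed in this thread): on a sample-and-hold display the eye tracks a moving object, so the perceived smear is roughly the distance the object travels during one frame of persistence.

def perceived_blur_px(speed_px_per_s, refresh_hz, persistence=1.0):
    # persistence = fraction of the frame the image stays lit (1.0 = full sample-and-hold)
    return speed_px_per_s * persistence / refresh_hz

speed = 1000  # a fast pan, in pixels per second (illustrative value)
for hz in (60, 120, 240):
    print(hz, round(perceived_blur_px(speed, hz), 1), "px of smear")
# 60 -> ~16.7 px, 120 -> ~8.3 px, 240 -> ~4.2 px: higher refresh rates shrink the
# smear without touching the panel's pixel response time at all.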
at 60 FPS it's absolutely useless.
60 FPS isn't enough to trick the eye and brain so well that the problems some people, like you, experience with moving pictures disappear. Just as an example, the retina of the eye samples more than 1000 times per second. Even if the brain and the stages between the retina and the brain (approximately 25 to 30 Hz) are much slower than the retina itself, the information isn't completely lost; it is still transferred to a degree.

It is like blending many images into one: no single image is lost entirely, even if the information of each individual image carries less weight than it would without blending.
THE STUFF THE MONITOR IS ABLE TO HANDLE. Think about the true meaning.
The problem isn't what the monitor can handle; it is that you mixed things up and don't understand what really causes the problems you are talking about!
Tearing is no issue; almost every game has a VSync mode and it will normalize the FPS rate. Tearing usually only appears when the internal framerate is much higher than the TV/monitor refresh rate of 60 Hz.
You are again misinformed. VSync doesn't remove tearing if it isn't used in combination with triple buffering; without triple buffering it just reduces the number of frames rendered to the refresh rate of the monitor, nothing else. It is also wrong that tearing only appears with framerates higher than the refresh rate of the monitor; it appears whenever the images aren't sent to the monitor synchronously with the monitor's refresh.
The only monitors with sufficient speed are AMOLED or any form of true LED, but those monitors are still under heavy development; they may become mature in around 5 years, but at the current time they are way too immature and extremely expensive.
This might be true for big screens, but it isn't true for small screens like those used in smartphones. There, OLED displays are nothing unusual anymore, and did you know that they are often used for screens with high resolutions, the very displays you are blaming for missing motion sharpness?
You may have good general hardware and software knowledge, but when it comes to picture hardware you seem to have a weak spot; you always seem to focus on aspects that are unable to produce quality when the technology is insufficient as a whole. You are looking at things that have nothing to do with the capability of providing moving-picture sharpness; that's a technology-related limitation and it's hard to improve that value (no matter what a manufacturer tells you).
The thing is that you don't understand how everything is connected, and I doubt that you have even basic knowledge of how human perception works, which is a big, if not the biggest, part of this subject. So the biggest problem isn't the hardware; it is perception and the way to trick it, and how to trick it has of course to do with the stuff I mentioned. I hope I have made that connection clear this time. I also hope the explanation of why your understanding of the monitor hardware was mostly wrong, so that you drew wrong conclusions from it, was good enough.
Of course, when a picture is made interlaced, there will be sharpness issues related to the media because it doesn't provide a full progressive image. Interlacing is a dirty trick to work around the limitations of "insufficient technology", but a new TV of the latest spec has no issue with progressive pictures (such as the ones coming from any game), so that's not an issue at all. I studied picture-related stuff for way too long, so it's not easy to fool me with nonsense.
Welcome to the club; it was also part of my studies, but I had no problems understanding all of this quickly enough that I didn't have to study it for way too long like you did.

A current monitor has of course no problem with progressive images, but the problem is that it usually doesn't get such progressive images; almost any media the TV receives isn't progressive, it is interlaced! Interlaced images look bad, of course, no matter which monitor you are using. And yes, I know about deinterlacing filters, but they are again just workarounds, like interpolation and similar stuff!
And I don't know why you are hyping the very old VGA standard and have bad words for HDMI; HDMI has its advantages too and is useful in many respects. DVI doesn't have an audio channel, so anyone using DVI will need a separate audio line. Ultimately those two standards are compatible with each other, because the signal can be converted without any loss in quality, so I don't know why you are telling me that DVI is good and HDMI bad; both basically use the same picture technology, HDMI just has a different interface.
First, where did I hype VGA? I just can't remember; can you show me the position?
Did you really study this stuff?
It really doesn't look like you did,
but otherwise this would explain why you took so long. DVI can of course transmit audio, and this has of course been done for a very long time, so there is NO need for a separate audio line! You are also wrong in saying that the signal is compatible because it can be converted; it is compatible because it IS almost the same signal! HDMI is just DVI with different, incompatible connectors where some things were removed and stuff like DRM (also supported by DVI, but not mandatory) and a license fee for this bullshit were added.

General color quality, black levels, motion sharpness and even more stuff.
Motion sharpness has nothing to do with higher resolutions! Black levels have nothing to do with higher resolutions!

The only thing that can really be affected by higher resolutions is color quality, and no, the color quality isn't decreased by higher resolution; on the contrary, it is increased, especially if the pixels are small enough that they can't be distinguished anymore. Then dithering can be used to increase the number of colors without the bad side effects dithering has at low resolutions, because the colors blend in perfectly once the resolution is high enough that individual pixels can't be distinguished.

Did you know that printers can only print a few colors and that they reach all the intermediate colors just through dithering, with dots small enough that they can't be distinguished anymore? Why should the same technique that makes printed work look so great be bad for screens?
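A minimal sketch of that idea, using ordered (Bayer) dithering to quantize a smooth grayscale ramp to pure black and white; the images and values are made up for illustration.

import numpy as np

BAYER_4X4 = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) / 16.0  # threshold map in [0, 1)

def ordered_dither(gray):
    # quantize a float image in [0, 1] to {0, 1} with a tiled Bayer threshold map
    h, w = gray.shape
    thresholds = np.tile(BAYER_4X4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return (gray > thresholds).astype(float)

# A horizontal brightness ramp: every column has a slightly different gray level.
ramp = np.tile(np.linspace(0.0, 1.0, 64), (16, 1))
dithered = ordered_dither(ramp)

# With only two output colors, the local average of the black/white dots stays
# close to the original gray level, which is why fine-enough dots read as
# smooth intermediate shades.
print(ramp[:, 28:36].mean(), dithered[:, 28:36].mean())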

In the old days I enjoyed the crisp CRT picture much more than the "super high resolution" of a computer monitor, because the computer monitor had a lot of detail, yes, but the picture always looked "grayed out"; it simply lacked the crisp and sharp picture of a good CRT TV. The CRT had a lower resolution, YES, but the picture looked more realistic. I know you are unable to understand those words, so it's fine to just read them and forget about it.
It looked more realistic, yes, but it was never sharper and crisper than today's monitors, and I have even seen CRTs that cost more than 1000€! The complete opposite is the case: today's screens look less natural (or "realistic", as you wrongly call it) BECAUSE they are sharper and crisper. The pixels can be distinguished, while the pixels on a CRT were blended into the surrounding pixels.

Only a monitor with a resolution so high that you can't distinguish the pixels anymore can produce a picture that looks similar or even much better! This is again the same problem as with motion sharpness: the resolution (in time or in pixels) is just too low to allow natural-looking blending.

In general it is the same contradiction as with analog and digital techniques. Analog devices produce the better results as long as the digital devices have too low a resolution, but once a digital device reaches a high enough resolution, it becomes much better than the analog device. The best examples are audio and broadcasting.

But the PC was never the solution to everything; originally it started out as a "worker machine" so the bureaucrats could attack innocent people with huge documents and economic-system-related matters; it wasn't meant to be the entertainment machine it has become nowadays
Again you are completely wrong. The worker machines were mainframes with just terminals connected to them; that was the approach IBM and others were following. Apple, on the other hand, had a different idea about computers, so they built computers for everyone, so-called personal computers (short: PCs). For everyone also means for entertainment, not only for work.

The IBM PC (the PC we know today) was just a reaction to Apple; just remember the well-known quote attributed to an IBM man, who supposedly said there was a world market for maybe five computers or so.

The Retina stuff serves the same use: it's for standing
It stands for the fact that digital screen technology has come far enough today that it can surpass analog devices, because the achievable resolution is now high enough to do so.

but not for real gamers who need "moving picture sharpness"
They are in need of other things, like higher refresh rates (resolution in time, because that is still too low to blend in completely) and lower latencies, and neither of those is contradictory to higher resolutions, which would also increase the image quality...
Post edited July 05, 2014 by ThomasD313
Xeshra: That game is close to unplayable for me, but a typical PC gamer would be unable to understand why.

So I need to provide some background first, or even a lot of background, because I know only a few people would understand why I even ask for a gamepad:
I come from the console scene, and one of the first games I ever played was Super Mario World on the SNES nearly 20 years ago. Most of the time I was playing on consoles, with the exception of a few PC-only games, for example the RTS genre, space sims and other PC-unique games (Freespace 2, the Anno series, WoW, EVE Online and much more).

GOG kind of changed the way I think about PC games, and they got rid of one of the biggest nuisances which made PC games virtually without value to me. I'm not just a gamer, I'm a collector too. Being a collector means that I value games that are in very good shape, not a DRM hog nor an "incomplete junk piece" or anything like that. Most true collectors consider almost any PC game close to "trash" when it comes to collector value, the opposite of many console games (I know most people fail to understand this because they were never collectors).

Anyway, I am very much used to a gamepad as my primary input device. There are just a few games and/or genres where KB + mouse is easier for me to use, for example shooters and RTS (the Civ series, the Anno series and more), so it truly depends on the genre. Some games work better with KB/mouse and some with a gamepad; there is no general rule. There is no "bad input device", only the "wrong input device", depending on the situation.

However, the genre where I really have a hard time using that KB/mouse crap is the RPG genre in general (with the exception of WoW and other online games). Risen is an RPG where a gamepad is supreme and simply better than a KB/mouse setup, but unfortunately the devs decided not to include any gamepad support at all. So PC gamers apparently have never seen a gamepad, even though good PC gamepads have been on the market for close to 10 years.

WHY?! There is an Xbox 360 version out there with perfect gamepad support, but the PC version has no support at all? How come? For countless years an Xbox 360 pad for Windows has been available to PC gamers, and it would be very easy to implement a controller mode that already exists: the one used for the Xbox 360!

Well, I'm not interested in starting a war, but the fact is that minorities just won't be supported for whatever reason, yet I find minorities just as important as majorities... everyone is equal, with equal rights and needs. That's why we have GOG, for example, so the DRM haters can have DRM-free games and the others can go and kiss Steam their whole lifetime, or until the moment their account gets canceled (sorry for knocking at someone's door, but minorities are important too).

So, is there any workaround or perhaps a patch able to make it playable for me? I know that (almost...) all PC gamers have zero issues using it (so they don't even think about it) because they are USED to it, but I know the advantage of a gamepad, and that game is one of those that needs one. Sad to see that the dev teams for the Xbox 360 and PC versions don't work with each other and don't try to share resources and support. Apparently they just do their own work, and if a dev from one team says "hello, I have pad support", the dev from the other "political party" may say "who are you, never seen you, don't know you, we have our own politics, so just be gone". I think that's the main issue with everything that is odd today: we don't work with each other.

Anyway, I got it from GOG for just $2, no need to be sad, I like to support GOG. But somehow I feel like I could have fun playing that game... if only I could use a gamepad. ;) Otherwise I simply have to stick to the Xbox 360 version, which seems like the much better deal for me (as a person, not a hardcore PC gamer, not a PC gamer freak, and not a "sit at a table/desk lover" either). Besides, I use an HDTV... yes, I'm a minority and I enjoy the way I play.

I hope RISEN 2 (TWO) will have gamepad support; I'm downloading it now, and if there is another lack of PC gamepad support... I may cry.

Besides:
Just to keep in mind: what is it that makes PC gaming better than console gaming? Supreme graphics? Honestly, half of the "good looking" games have weak gameplay and won't stand among the "best games of all time"; it doesn't matter how much graphics they provide when the gameplay is lacking. The thing PC gaming is truly superior at is offering the highest versatility: it can simply be "tuned up" for every single setting and every single need, including the input device. That's the strongest spot of PC gaming, not the "supreme graphics".
I'm a PC and console gamer. My first PC was an Apple IIe back in 1984, and I had a ColecoVision in the 1980s. I played computer games as well as console games. I used to use the keyboard/mouse, but now, playing on my couch on my 80-inch TV, I also prefer to use a controller. PC games can support both, to suit each person's preference.