avatar
GameRager: Do vector graphics scale text along with the graphics when using higher resolutions? Just curious. I've heard the term used for lighting in games and also vector fog, but not much else.

I would still follow the PS:T modding guide when installing it to test it, though - it walks you through installing/patching/modding for various things like resolution and makes the process as pain-free and easy as possible.
avatar
orcishgamer: Yes, instead of drawing a line, vector defines a line, so it can be drawn at any resolution, in theory with no jaggies (instead of trying to scale down or up an already drawn line). Probably someone who knows more than just a teeny bit about it can explain it better.
Sounds nice. Besides all that, I wish games used higher resolution/definition source graphics and assets. (I often hear that developers build the game using high-res assets like textures and the rest, then "dumb it down" to varying degrees when they start pressing it to discs or making installers for online sales, since users' PCs are supposedly far less powerful than the server farms they use to develop the things.)
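To make the vector idea concrete, here is a minimal Python sketch (purely illustrative - the rasterizer is a toy, not how any real engine draws): the line is stored as geometry in normalized coordinates, so it can be re-rasterized cleanly at any target resolution instead of resampling an already-drawn bitmap.

```python
# Toy rasterizer: a vector line is geometry, so it is redrawn fresh at
# each output resolution rather than scaled up from a finished image.

def rasterize_line(x0, y0, x1, y1, width, height):
    """Sample the line densely and snap each sample to the pixel grid."""
    pixels = set()
    steps = max(width, height)
    for i in range(steps + 1):
        t = i / steps
        # Endpoints are normalized (0..1); scale to the target resolution.
        px = round((x0 + (x1 - x0) * t) * (width - 1))
        py = round((y0 + (y1 - y0) * t) * (height - 1))
        pixels.add((px, py))
    return pixels

# The same vector definition rendered at two resolutions - no scaling
# artifacts, because nothing is ever resampled.
low = rasterize_line(0.0, 0.0, 1.0, 0.4, 320, 200)
high = rasterize_line(0.0, 0.0, 1.0, 0.4, 1920, 1080)
print(len(low), len(high))  # the high-res pass simply uses more pixels
```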
avatar
cogadh: I am intimately familiar with adding custom resolutions (it's the only way I can get fixed aspect ratio scaling to work, long story), but it doesn't matter: if it is a res that the monitor does not normally support, it will always look like crap. Maybe it is just my monitor, but trying to run anything other than the limited set of resolutions it already supports results in an image that looks like the screen is covered with a thick coating of Vaseline.
avatar
kalirion: If you're running a resolution your monitor does not support, you must be doing it through the driver's custom resolutions, and if that looks bad, it's the video card's or the driver settings' fault, not the monitor's.
No, it's not. If I try running a resolution my monitor does not support, the monitor automatically tries to scale it up to the native resolution, resulting in a poor image. Trust me, I have dealt with the quirks of this monitor for quite a while now; it is definitely the particular hardware at fault here. If I could turn off this monitor's scaling functions, I could probably use the video card drivers to get a new resolution that looks good, but I can't; the monitor does not allow for that.
avatar
cogadh: No, it's not. If I try running a resolution my monitor does not support, the monitor automatically tries to scale it up to the native resolution, resulting in a poor image. Trust me, I have dealt with the quirks of this monitor for quite a while now; it is definitely the particular hardware at fault here.

If I could turn off this monitor's scaling functions, I could probably use the video card drivers to get a new resolution that looks good, but I can't; the monitor does not allow for that.
That's very weird. A monitor usually isn't even able to show anything for a non-supported resolution, other than a "bad signal" error or whatever.

It really looks to me like your GPU drivers are doing all the scaling for unsupported resolutions. A monitor cannot "forbid" a GPU from doing its own scaling. The monitor doesn't even know if the GPU does any scaling - all it knows is that it gets a signal of some resolution from the video card, not what the original resolution was. So if it's a 1080p monitor and the GPU scales a 640x480 image to 1920x1080, the monitor is none the wiser.


What is your video card? How are you trying to do GPU scaling?
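A toy model of that last point (the function name is made up for illustration, not any real driver API): with GPU scaling enabled, the only resolution that ever travels down the cable is the scaler's output.

```python
# Toy model of the display chain: the monitor receives whatever comes out
# of the last scaler, with no record of what the game originally rendered.

def signal_on_cable(game_res, native_res, gpu_scaling_enabled):
    """Return the resolution the monitor actually sees."""
    if gpu_scaling_enabled:
        return native_res  # the GPU resamples first; the source res is gone
    return game_res        # monitor gets the raw mode and must scale itself

print(signal_on_cable((640, 480), (1920, 1080), gpu_scaling_enabled=True))
# -> (1920, 1080): the monitor is none the wiser about the 640x480 source
print(signal_on_cable((640, 480), (1920, 1080), gpu_scaling_enabled=False))
# -> (640, 480): now the monitor's own scaler has to kick in
```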
avatar
cogadh: No, it's not. If I try running a resolution my monitor does not support, the monitor automatically tries to scale it up to the native resolution, resulting in a poor image. Trust me, I have dealt with the quirks of this monitor for quite a while now; it is definitely the particular hardware at fault here.

If I could turn off this monitor's scaling functions, I could probably use the video card drivers to get a new resolution that looks good, but I can't; the monitor does not allow for that.
avatar
kalirion: That's very weird. A monitor usually isn't even able to show anything for a non-supported resolution, other than a "bad signal" error or whatever.

It really looks to me like your GPU drivers are doing all the scaling for unsupported resolutions. A monitor cannot "forbid" a GPU from doing its own scaling. The monitor doesn't even know if the GPU does any scaling - all it knows is that it gets a signal of some resolution from the video card, not what the original resolution was. So if it's a 1080p monitor and the GPU scales a 640x480 image to 1920x1080, the monitor is none the wiser.


What is your video card? How are you trying to do GPU scaling?
Trust me, it's not the video card. I've spent enough time messing with both the monitor and the graphics card to be 110% sure of that. My monitor does its own built-in scaling and it cannot be disabled. For example, if I enable fixed aspect ratio scaling on the video card and set a game to run at 1024X768, the monitor ignores the video card's scaling and scales the image up to 1680X1050 on its own, complete with horizontal stretching and the "muddying" of the image. The only way to get fixed aspect ratio scaling to work is to create a custom resolution that "tricks" the monitor into thinking it is already getting a 1680X1050 signal while only displaying the image in a properly scaled 4:3 section of the screen (it's a complicated process). I can't do the same thing with a widescreen resolution because it would simply create a "fully boxed" view (a black box around the entire image), similar to the "pillar boxed" view I get with a 4:3 resolution (pillar boxing = OK, fully boxed = sucks).
avatar
Wishbone: Also, higher resolutions won't necessarily make 3D games look better, sometimes the opposite. Older 3D games used much fewer polygons in their models than modern games do. A model that looked fine in 800x600 on a CRT monitor will almost certainly look like crap rendered in crisp 1920x1080 on a high quality LCD display.

And even if 3D models don't get pixelated when they are scaled up, the textures they use do, which looks even more out of place at high resolutions.
Sorry but no, really.
3d games of the hardware accelerated era (1995-1996 onward) always had crazy high resolutions. Quake 1 (1996) supported very high resolutions from the start. And higher resolution in 3d doesn't mean "more blurry textures"; it just means less visible aliasing and less wobbly polygons. Texture resolution is not related to your screen resolution in any way.

Also keep in mind that old monitors ran resolutions like 800x600 and 1024x768 natively. New monitors have to scale an image at those resolutions up to their native resolution, and you end up with a blurry image that looks way worse than it did on an older monitor.

When choosing a resolution, it's always good to keep it at an integer fraction of your monitor's native resolution; this way, if you want to play a game at a lower resolution, at least you will have a block of 4 real monitor pixels for any pixel in the game (x2) or a block of 6 (x3). If you choose any resolution in between, you'll end up with blurry pixels - same thing if the aspect ratio is different.
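Here is a quick sketch of that integer-factor rule in Python (the 1920x1080 native resolution is just an example):

```python
# Check whether a game resolution maps to whole blocks of monitor pixels.
# A clean integer factor means each game pixel becomes an exact NxN block;
# any fractional factor forces the scaler to interpolate, i.e. blur.

def scale_quality(native, target):
    nw, nh = native
    tw, th = target
    fx, fy = nw / tw, nh / th
    if fx == fy and fx.is_integer():
        return f"{tw}x{th}: exact {int(fx)}x{int(fx)} blocks - stays sharp"
    return f"{tw}x{th}: factors {fx:.2f}/{fy:.2f} - interpolation (blur)"

native = (1920, 1080)  # example native resolution
for target in [(960, 540), (640, 360), (800, 600), (1024, 768)]:
    print(scale_quality(native, target))
```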
avatar
cogadh: Trust me, it's not the video card. I've spent enough time messing with both the monitor and the graphics card to be 110% sure of that. My monitor does its own built-in scaling and it cannot be disabled. For example, if I enable fixed aspect ratio scaling on the video card and set a game to run at 1024X768, the monitor ignores the video card's scaling and scales the image up to 1680X1050 on its own, complete with horizontal stretching and the "muddying" of the image. The only way to get fixed aspect ratio scaling to work is to create
It's because a game pixel is not equal to one screen pixel, or a block of 4 (2x2), or a block of 6 (3x3) and so on.
It's not something you can disable; it's simple math. If you want to put an 800x600 image on a monitor that has 1920x1080 physical pixels, you can't do that without interpolating colors, as a single game pixel will cover a fractional block of screen pixels (about 2.4 across and 1.8 down) rather than a clean 2x2 or 3x3.
It's even worse if the aspect ratio is also different, because you'll get additional horizontal-only blurring from the interpolation, as the resulting pixel is not square.
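A short sketch of why the fractional factor forces blending (assuming simple linear interpolation, which real scalers approximate):

```python
# Upscaling an 800-pixel-wide row to 1920 output pixels is a 2.4x factor,
# so output pixel centers fall between source pixels and each output pixel
# has to blend two neighbors instead of copying one.
import math

src_w, dst_w = 800, 1920
factor = dst_w / src_w  # 2.4 - not an integer, so no clean pixel blocks

for out_x in range(4):
    # Map the output pixel center back into source coordinates.
    src_x = max(0.0, (out_x + 0.5) / factor - 0.5)  # clamp at left edge
    left = math.floor(src_x)
    w_right = src_x - left  # fraction taken from the right-hand neighbor
    print(f"output {out_x}: blends source {left} and {left + 1} "
          f"({1 - w_right:.2f} / {w_right:.2f})")
```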
Post edited January 20, 2011 by Eclipse
avatar
Eclipse: block of 6 (x3).

or a block of 6 (3x3) and so on.
You mean 9 right?
argh... yes sorry -__-
avatar
cogadh: Trust me, it's not the video card. I've spent enough time messing with both the monitor and the graphics card to be 110% sure of that. My monitor does its own built-in scaling and it cannot be disabled. For example, if I enable fixed aspect ratio scaling on the video card and set a game to run at 1024X768, the monitor ignores the video card's scaling and scales the image up to 1680X1050 on its own, complete with horizontal stretching and the "muddying" of the image. The only way to get fixed aspect ratio scaling to work is to create
avatar
Eclipse: It's because a game pixel is not equal to one screen pixel, or a block of 4 (2x2), or a block of 6 (3x3) and so on.
It's not something you can disable; it's simple math. If you want to put an 800x600 image on a monitor that has 1920x1080 physical pixels, you can't do that without interpolating colors, as a single game pixel will cover a fractional block of screen pixels (about 2.4 across and 1.8 down) rather than a clean 2x2 or 3x3.
It's even worse if the aspect ratio is also different, because you'll get additional horizontal-only blurring from the interpolation, as the resulting pixel is not square.
While all that is (mostly) true, that is not at all what I am talking about. I already know why a scaled-up image looks like crap; the point was to explain why my monitor is the source of my dilemma in this particular circumstance. Most decent widescreen monitors that can do their own scaling have an option to disable that function and allow all scaling to be handled by the video card alone. Mine does not allow that, so it always scales the image, and the scaling it does is rudimentary at best, leading to exceptionally bad image quality at anything other than its normally supported resolutions. The monitor obviously does a simple multiplication of pixels, while the video card actually does some interpolative calculations to improve the image quality while scaling. It's still just math, but not necessarily "simple math".
avatar
cogadh: The only way to get fixed aspect ratio scaling to work is to create a custom resolution that "tricks" the monitor into thinking it is already getting a 1680X1050 signal while only displaying the image in a properly scaled 4:3 section of the screen (it's a complicated process).
That's exactly what I was talking about - use the drivers to scale, and then the monitor will only see the 1680x1050 signal. And it should not be an especially complicated process.

Do you have an NVIDIA card? If so, something is wrong, because that's exactly what the "fixed aspect ratio" scaling option in the NVIDIA control panel is supposed to do! It should scale the 1024x768 image to 1680x1050 (i.e. black bars surrounding a 1400x1050 image) and send that to the monitor. There is no way for the monitor to ignore the video card's scaling because it doesn't know that the original image was at 1024x768.

This is why I am saying the problem is in the video card or drivers, because there is no way for a monitor to screw it up in this manner - it just doesn't have the information to do so!
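For what it's worth, the arithmetic behind that claim is simple enough to sketch (a toy calculation, not driver code):

```python
# Fit a 4:3 image inside the panel's native resolution, padding the rest
# with black. The monitor only ever sees the native-resolution signal.

def fit_with_bars(source, native):
    sw, sh = source
    nw, nh = native
    scale = min(nw / sw, nh / sh)          # largest scale that still fits
    w, h = round(sw * scale), round(sh * scale)
    return (w, h), (nw - w) // 2, (nh - h) // 2  # image size, x/y bar sizes

image, bar_x, bar_y = fit_with_bars((1024, 768), (1680, 1050))
print(image, bar_x, bar_y)  # (1400, 1050), 140px pillars, no letterboxing
```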

Can you post a screenshot of the driver settings you're using?
Post edited January 20, 2011 by kalirion
avatar
cogadh: The only way to get fixed aspect ratio scaling to work is to create a custom resolution that "tricks" the monitor into thinking it is already getting a 1680X1050 signal while only displaying the image in a properly scaled 4:3 section of the screen (it's a complicated process).
avatar
kalirion: That's exactly what I was talking about - use the drivers to scale, and then the monitor will only see the 1680x1050 signal. And it should not be an especially complicated process.

Do you have an NVIDIA card? If so, something is wrong, because that's exactly what the "fixed aspect ratio scaling" option in the NVIDIA control panel is supposed to do! It should scale the 1024x768 image to 1680x1050 and send that to the monitor. Can you post a screenshot of the driver settings you're using?
You don't seem to be understanding, so I guess I'll have to explain it in detail.

My monitor, regardless of the video card's make, model, or resolution settings, is capable of doing its own scaling. That function cannot be turned off. It does not matter what resolution the desktop or game is set to; the monitor will always scale it up to 1680X1050.

I do have an Nvidia card. Simply turning on fixed aspect ratio scaling in the driver control panel is not enough due to the way the monitor functions. I also think your understanding of fixed aspect ratio scaling is a little off. All that does is maintain the proper 4:3 ratio while scaling the image to fill the screen vertically. It does not change the resolution as reported to the monitor, so if a game is set at 1024X768, as far as the monitor is concerned, it is being sent a 1024X768 image, at which point my monitor's automatic scaling takes over and stretches the image to fill the rest of the 1680X1050 space.

In order to get around this problem, I have to turn on fixed aspect ratio scaling, then go into the custom resolutions and create a series of custom 4:3 resolutions, one for each of the common 4:3 types (640X480, 800X600, 1024X768, 1280X1024). In the "Create Custom Resolutions" applet, I leave my default resolution in place and change the timing method to "Manual". This allows all of the timing information, most importantly the "Active pixels" entry, to stay exactly as it should be for my default res. I then change the display mode horizontal and vertical to the 4:3 res I want. Now, when a game tries to use a resolution like 1024X768, all 1680X1050 pixels remain active, tricking the monitor into not trying to scale the image. The video card takes care of scaling the image properly to fit the monitor without stretching it, and the game looks great.
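A rough model of what those custom modes accomplish (the field names are illustrative, not the actual applet's internals):

```python
# The manual timing keeps every native pixel "active", so the monitor
# believes it is receiving a full 1680x1050 signal; the GPU renders the
# game into a centered 4:3 region and fills the remainder with black.

custom_modes = [
    # (resolution the game requests, "active pixels" reported downstream)
    {"display_mode": (640, 480),   "active_pixels": (1680, 1050)},
    {"display_mode": (800, 600),   "active_pixels": (1680, 1050)},
    {"display_mode": (1024, 768),  "active_pixels": (1680, 1050)},
    {"display_mode": (1280, 1024), "active_pixels": (1680, 1050)},
]

for mode in custom_modes:
    w, h = mode["display_mode"]
    aw, ah = mode["active_pixels"]
    # The monitor sees only active_pixels, so its own scaler never engages.
    print(f"game {w}x{h} -> signal {aw}x{ah} (monitor scaler bypassed)")
```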
Some monitors are definitely weirder about scaling than others, but every LCD in my experience is blurry to some degree at non-native resolutions. I don't think your monitor is special in any way; it is just worse at scaling than some others.

Though kalirion was wrong about what fixed aspect ratio scaling means, I think his core point was that nVidia has a setting where it scales the image for you and your monitor still receives a native-resolution signal. He is right that this exists, but it is still a little blurry compared to using your real native res... anything scaled will always lose fidelity. It should be better than your monitor's scaling, though, if you think that's really bad.

In any event, for pixel-based games there is just no other option than a) deal with the non-native image blur or b) deal with everything being super small. There's nothing else to do really.
Actually, he was not right. Nvidia's fixed aspect ratio scaling only works if the monitor does not try to scale on its own. It does not send any kind of signal that tells the monitor "we're running at 1680X1050 but we're only going to use a 4:3 space in that res"; it simply sends a 4:3 res that is scaled to fill the screen vertically. The reported resolution, from the monitor's perspective, is still a 4:3 resolution, which is why my particular monitor's annoying habit of automatically scaling is a problem.
You're connected through DVI? And am I right in thinking Centered Timings (no scaling) also gets ignored by the LCD?
It's kinda weird to hear that flat panel scaling would be enforced no matter what; normally the drivers override such behaviour. Maybe there's something off with the EDID information? Did the panel come with a driver? I know that for some special resolutions, some EDID tweaking can give very good results and open up the accepted limits.
I've done it for some higher-than-native resolutions (I achieved something like a 5k horizontal res on a 1680x1050 native LCD, although it would only work at a reduced refresh rate) - you don't need AA anymore :P
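For anyone curious, the preferred mode can be read straight out of the EDID on Linux (a hedged sketch: the sysfs path is a guess and varies by GPU and connector, and only the first detailed timing descriptor is parsed):

```python
# Read a panel's preferred (native) mode from its EDID block. The path
# below is hypothetical; your card/connector name will differ.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-DVI-D-1/edid")  # assumption

edid = EDID_PATH.read_bytes()
dtd = edid[54:72]  # first 18-byte detailed timing descriptor = preferred mode

# Active pixel counts are split across a low byte and a high nibble.
h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
print(f"EDID preferred mode: {h_active}x{v_active}")
```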
Post edited January 20, 2011 by pops117
avatar
cogadh: Actually, he was not right. Nvidia's fixed aspect ratio scaling only works if the monitor does not try to scale on its own. It does not send any kind of signal that tells the monitor "we're running at 1680X1050 but we're only going to use a 4:3 space in that res"; it simply sends a 4:3 res that is scaled to fill the screen vertically. The reported resolution, from the monitor's perspective, is still a 4:3 resolution, which is why my particular monitor's annoying habit of automatically scaling is a problem.
That's not the way it's always worked for me. My monitor also has its own scaling, but if I have fixed aspect ratio scaling turned on, my monitor always receives its native resolution. When I play a 1024x768 game and go into the monitor menu, it shows that it's getting a 1920x1080 signal. When I switch the control panel back to "Use my display's built-in scaling", the monitor shows that it's getting a 1024x768 signal.

And if I turn on the "Do not scale" option, I get a perfectly crisp image in the middle, surrounded by black bars on all sides.
Post edited January 20, 2011 by kalirion
avatar
pops117: You're connected through DVI? And am I right in thinking Centered Timings (no scaling) also gets ignored by the LCD?
It's kinda weird to hear that flat panel scaling would be enforced no matter what; normally the drivers override such behaviour. Maybe there's something off with the EDID information? Did the panel come with a driver? I know that for some special resolutions, some EDID tweaking can give very good results and open up the accepted limits.
I've done it for some higher-than-native resolutions (I achieved something like a 5k horizontal res on a 1680x1050 native LCD, although it would only work at a reduced refresh rate) - you don't need AA anymore :P
Yes, connected via DVI. Yep, all scaling options are ignored by the monitor. No, it does not have drivers. This is a known issue with this particular model of monitor (Acer X223W). Apparently, previous revisions of the monitor had the ability to turn off the built-in scaling, but the current ones do not.
Post edited January 20, 2011 by cogadh