Well, he did say it was a tangent, but beyond that I didn't bother to read all the details, because I was just wondering about the TV in the first place, so, whatever.
I'd advise you against purchasing it. The Vizio brand is sold almost exclusively at Wal-Mart and Best Buy. They are terrible televisions for any purpose: poorly made (with weak processors and too little memory for the smart features) and short-lived (around 3 years for the 2 TVs I purchased). I'm speaking as someone who tried to go 4K the budget route with this very budget brand. Save your money and get a trusted brand. A good 65 inch 4K TV should cost around $1,000 or more. I'd recommend LG or Sony. You really do get what you pay for in the electronics market. For a reference point, my PC monitor (27 inch, 240Hz, 1440p, G-Sync Ultimate) was around $1,200 at the time of purchase.

I also wouldn't recommend most TVs for PC gaming either. Take this one, for example...
4K...means you are stuck with either a scaled 480p, a stretched blurry scaled 720p, an evenly scaled but somewhat decent 1080p, or 4K at 60Hz...nothing in between, as most 4K TVs do not support 1440p (and it would be unevenly scaled even if they did).
Freesync...but with a stock 60Hz panel it's kinda useless, due to Freesync cutting out at around 45fps.
Latency...latency...latency.
Post edited November 04, 2024 by RizzoCuoco
RizzoCuoco: ...4K...means you are stuck with... ...a stretched blurry scaled 720p...
1280x720 is exactly one ninth of 3840x2160, so the only way any 4K TV could somehow manage to screw that up is with older systems where you can't enable GPU scaling and thus would have to rely on whatever scaling support the TV has.
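To put numbers on that, here is a minimal Python sketch (the 3840x2160 panel size and the choice of 854x480 for widescreen 480p are the only assumptions) that checks which common source resolutions land on an exact integer scale factor:

# Minimal sketch: which common source resolutions map to an exact integer
# scale factor on a 3840x2160 panel?
PANEL_W, PANEL_H = 3840, 2160

modes = {
    "480p (854x480)": (854, 480),
    "720p (1280x720)": (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
}

for name, (w, h) in modes.items():
    sx, sy = PANEL_W / w, PANEL_H / h
    exact = sx == sy and sx.is_integer()
    print(f"{name:>18}: x{sx:.3f} horizontal, x{sy:.3f} vertical -> "
          + ("integer scale" if exact else "non-integer scale"))

Only 720p (3x) and 1080p (2x) come out as exact integer multiples; 1440p lands on 1.5x, which is where the uneven-scaling complaint comes from.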
Well I didn't mean to imply that I was considering buying a TV, but only that I happened to see it and was curious what it is, but I'll keep that in mind the next time that I do want to buy one. Thanks.

But I'm not sure why it couldn't easily be compatible with all of those resolutions, and I don't see why a TV wouldn't want to go out of its way to be usable as a computer monitor, because after all, a lot of people do use them for that purpose.
RizzoCuoco: ...4K...means you are stuck with... ...a stretched blurry scaled 720p...
JAAHAS: 1280x720 is exactly one ninth of 3840x2160, so the only way any 4K TV could somehow manage to screw that up is with older systems where you can't enable GPU scaling and thus would have to rely on whatever scaling support the TV has.
yup...exactly one ninth...thus 4K is not evenly divisible...thus not a good scale, whereas 1080p is one fourth, which is evenly divisible. That's exactly how scaling works, my friend. We were not talking about all 4K TVs...we were talking about this specific one. Reread where I said I don't recommend most TVs. Whether you use GPU or monitor scaling...there are parameters and limitations. 1440p is really the only resolution where 720p looks correct, because it is evenly divisible. Same way with 4K and 1080p. Obviously, you are dealing with mismatched scaling factors if you try to put a 720p source on 4K; it will always look off. That means a single 720p pixel has to be displayed across nine 4K pixels to make the 720p image fill the 4K panel/screen, while with a 1080p source a single pixel has to be displayed across four 4K UHD pixels to fill the screen.
Post edited November 04, 2024 by RizzoCuoco
HeresMyAccount: Well I didn't mean to imply that I was considering buying a TV, but only that I happened to see it and was curious what it is, but I'll keep that in mind the next time that I do want to buy one. Thanks.

But I'm not sure why it couldn't easily be compatible with all of those resolutions, and I don't see why a TV wouldn't want to go out of its way to be usable as a computer monitor, because after all, a lot of people do use them for that purpose.
Well...simply put...a TV is not a monitor. They have a set range of resolutions...and usually very few to choose from. I'd argue that they are made for console players. I don't know too many PC players who would bother. Try doing anything computer-related on a 4K TV. Good luck being at all productive at 65 inches. I'm speaking from experience. A TV does not make a good monitor substitute. Never mind what it will do to your vision, with all the fatigue.
Post edited November 04, 2024 by RizzoCuoco
The system double posted.
Post edited November 04, 2024 by RizzoCuoco
HeresMyAccount: and I don't see why a TV wouldn't want to go out of its way to be usable as a computer monitor
Sorry for the cropped quote, but I'll throw in my 2 cents regarding the above.
RizzoCuoco edited the post regarding productivity, so I'll refrain from disputing it :)

A TV can be a perfectly fine monitor replacement, depending on what it's used for.
That said, most TVs have a big lag between receiving a frame and actually displaying it; even with every kind of post-processing disabled, the lag can be around 1 second. In that situation, even navigating with the mouse pointer is difficult. In my very limited personal experience, the more features the TV has, the worse the latency is.

Also, most GPUs on the market don't support HDMI CEC for some reason, and TVs, unlike monitors, do not turn on automatically upon receiving an HDMI signal. That means the computer cannot turn the TV on and it has to be done manually; depending on the use case, this gets tiresome pretty fast, especially for me, as I really dislike remotes.
Not helping is the fact that each TV vendor implements CEC however they like, under their own trade names. Most work fine with standard commands, though.

Some rare x86 computers do support CEC (a few Intel NUCs, after users asked for it for years) and many ARM boxes do as well, including all the Raspberry Pis. I don't know of a single mainstream dedicated GPU that implements CEC, though.
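For what it's worth, on a box that does support CEC (a Raspberry Pi, for example) powering the TV on from the computer only takes a few lines. A minimal sketch, assuming libcec plus the python-cec bindings (the "cec" package) are installed; the exact API may differ between binding versions:

# Minimal sketch: power a CEC-capable TV on from the computer.
# Assumes libcec and the python-cec bindings are installed and the host
# (e.g. a Raspberry Pi) exposes a CEC adapter.
import cec

cec.init()              # open the first CEC adapter libcec can find

tv = cec.Device(0)      # logical address 0 is always the TV

if not tv.is_on():
    tv.power_on()       # same effect as the power button on the remote
# tv.standby()          # would send the TV back into standby

Whether a given TV actually honours these commands still depends on the vendor's CEC implementation, as noted above.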

I do have a laptop connected to the TV's VGA port through an HDMI adapter. It does turn on every time the computer sends a signal, but I assume it's quite rare to find a TV nowadays with a "computer port"/VGA input.
HeresMyAccount: and I don't see why a TV wouldn't want to go out of its way to be usable as a computer monitor
Dark_art_: Sorry for the cropped quote, but I'll throw in my 2 cents regarding the above.
RizzoCuoco edited the post regarding productivity, so I'll refrain from disputing it :)

A TV can be a perfectly fine monitor replacement, depending on what it's used for.
That said, most TVs have a big lag between receiving a frame and actually displaying it; even with every kind of post-processing disabled, the lag can be around 1 second. In that situation, even navigating with the mouse pointer is difficult. In my very limited personal experience, the more features the TV has, the worse the latency is.

Also, most GPUs on the market don't support HDMI CEC for some reason, and TVs, unlike monitors, do not turn on automatically upon receiving an HDMI signal. That means the computer cannot turn the TV on and it has to be done manually; depending on the use case, this gets tiresome pretty fast, especially for me, as I really dislike remotes.
Not helping is the fact that each TV vendor implements CEC however they like, under their own trade names. Most work fine with standard commands, though.

Some rare x86 computers do support CEC (a few Intel NUCs, after users asked for it for years) and many ARM boxes do as well, including all the Raspberry Pis. I don't know of a single mainstream dedicated GPU that implements CEC, though.

I do have a laptop connected to the TV's VGA port through an HDMI adapter. It does turn on every time the computer sends a signal, but I assume it's quite rare to find a TV nowadays with a "computer port"/VGA input.
Agreed...and well said. It's doable...just not preferable or optimal.
JAAHAS: 1280x720 is exactly one ninth of 3840x2160, so the only way any 4K TV could somehow manage to screw that up is with older systems where you can't enable GPU scaling and thus would have to rely on whatever scaling support the TV has.
RizzoCuoco: yup...exactly one ninth...thus 4K is not evenly divisible...thus not a good scale, whereas 1080p is one fourth, which is evenly divisible. That's exactly how scaling works, my friend. We were not talking about all 4K TVs...we were talking about this specific one. Reread where I said I don't recommend most TVs. Whether you use GPU or monitor scaling...there are parameters and limitations. 1440p is really the only resolution where 720p looks correct, because it is evenly divisible. Same way with 4K and 1080p. Obviously, you are dealing with mismatched scaling factors if you try to put a 720p source on 4K; it will always look off. That means a single 720p pixel has to be displayed across nine 4K pixels to make the 720p image fill the 4K panel/screen, while with a 1080p source a single pixel has to be displayed across four 4K UHD pixels to fill the screen.
How exactly is it not evenly divisible? 9 is 3x3, each 1280x720 pixel equals 3x3 4K pixels, no different from each HD pixel being 2x2 4K pixels.
I dunno...I just make stuff up. Evenly divisible might be the wrong word choice. The point is...1 pixel stretched and distorted across 9 pixels is nothing like one pixel stretched across 4. The picture becomes hazy and loses contrast. That's simply just how it works. 720p scaled looks far better on a 1440p res. Let's correct the original statement and say yes, indeed, the pixel that is upscaled from 720p is evenly distributed, but also...it gets stretched to a far greater degree, making it blurry and out of focus.
Post edited November 05, 2024 by RizzoCuoco
Since people have provided answers and discussion already, I'd say we should be more worried about modern smart TVs having cameras and mics ready to collect every single piece of human data that can possibly be produced...

Oh, how we love the modern world of IoT, Alexas and the like. So many services for Free! *cough cough* (while they collect our data) But it's free! :P
RizzoCuoco: ...The picture becomes hazy and loses contrast. That's simply just how it works...
Only if the source signal is not integer scaled before it is sent to the TV/monitor, in which case it all depends on how the scaling is done by different manufacturers and models.
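To illustrate what integer scaling amounts to, here is a minimal numpy sketch with a toy 2x2 image (purely illustrative, not any particular GPU's implementation): each source pixel is replicated into an exact 3x3 block, so nothing is interpolated and nothing gets smeared.

# Minimal sketch of integer (nearest-neighbour) upscaling: every source pixel
# becomes an exact 3x3 block of identical pixels, with no interpolation.
import numpy as np

src = np.array([[10, 20],
                [30, 40]])      # toy 2x2 "720p" image

scale = 3                       # 1280x720 -> 3840x2160 is exactly 3x
up = np.repeat(np.repeat(src, scale, axis=0), scale, axis=1)

print(up)
# [[10 10 10 20 20 20]
#  [10 10 10 20 20 20]
#  [10 10 10 20 20 20]
#  [30 30 30 40 40 40]
#  [30 30 30 40 40 40]
#  [30 30 30 40 40 40]]

Whether the picture ends up hazy therefore depends on whether the scaler actually does this kind of pixel replication or falls back to interpolation.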
.Keys: Since people have provided answers and discussion already, I'd say we should be more worried about modern smart TVs having cameras and mics ready to collect every single piece of human data that can possibly be produced...

Oh, how we love the modern world of IoT, Alexas and the like. So many services for Free! *cough cough* (while they collect our data) But it's free! :P
My last TV died and I bought a replacement. The single factor that sold it was that it had a hardware switched microphone that I could turn off permanently without fear of remote activation.
amok: I am very confused. Do you have some kind of point with all this, or is it all just mental masturbation?
Such vulgarity. Why am I not allowed to contribute to this vitally important and essential conversation? Because you deem my contribution worthless? Careful with that metric; you may come to regret it. Sauce for the goose, and all that.

I could just point out the projection but I will answer your question. (In case you didn't learn it from my response, where I told you, directly.)
amok: edit. And just to remind you, because you seem to have missed it, in your own definition of gamut, the first point you have there is:

"1 the complete range or scope of something: [...]"

so the complete range of color being shown on a digital output is baked into your own dictionary quote. If you do not have anything productive to add, then that is that. But do feel free to indulge in some more navel-gazing
I am a semeiotician. Words are symbols, whose meanings, by definition, can and almost always do change, and our minds are symbolic processors that have evolved in tandem with the languages we speak. (I don't care about the particulars of a specific word, more the ebb and flow of meaning: the Variability of Big Data, if you will indulge my metaphor.)

My contribution was the root of the word. I had no intention of replacing your definition, merely adding some context. Your contribution was the current definition (without any citation to allow others to independently confirm your assertion, which you might look to correct in future: just sayin') that was in agreement, as you noted in your edit, with what I stated. (Shock, horror, the current definition is related to the original sense. I must caution, however, that this is not necessarily universal. Hence my post.)

You may find my contribution unnecessary - that's your right in a free exchange of ideas, and I will fight to the death (no exaggeration) for your right to do so - but I also don't pay much heed to your opinion, since I have an internal frame of reference (perhaps you look for too much validation from others?) with a healthy connection to empirical reality.

My point, as I noted, was to extract some more detail from you. (You are a smart fellow; it's a shame you spend so much time sniping beneath your abilities. It is a poor game unworthy of your attentions.) I succeeded in prompting you to provide some more to the conversation (even if it was not sufficient to reach a standard of excellence) and thus I have no more to contribute. Unless you wish to discuss something else?
scientiae: [...]
It is 'semiotician,' not 'semeiotician.'

Words and languages are symbolic, not symbols (there is a difference).

Investigating the root of a word is etymology, not semiotics (which is the study of signs).

Claiming to be interested in semiotics while thinking 'mental masturbation' is vulgar is laughable, especially since you just talked about words as symbolic. (Just google it. You, the great 'semeiotician', know that the cultural use of a sign infers its meaning, so....)

May I also point out, regarding an earlier point, that humans do not have 'symbolic minds.' We have minds that can interpret symbols and deal with concepts, but they are far from symbolic and can deal just as well with iconic signs.

You basically have no idea what you are talking about, and if I were to be as vulgar as you have claimed me to be, I would say that your contribution so far has been shit (and that is a symbol in itself).

If your next post is just as vapid, then I will leave it there.