4K vs UHD
-
@thecreativeone91 said:
(snip)
There are also some HDTVs (1920x1080) that will want a 1024x768 signal over the VGA/DVI ports for some reason.
Right. I wasn't thinking about the early HDTVs or projectors. But generally it doesn't matter what resolution is negotiated between the PC and the device... the PC is the one responsible for sending the video output and scaling it to the current resolution.
Edit: Sometimes it works well, and others, it totally sucks.
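Not from any actual driver - just a minimal sketch of the kind of scaling math the PC's side ends up doing when the desktop resolution and the negotiated display resolution don't match (assuming simple aspect-preserving letterboxing; the helper name is made up for illustration):

```python
# Hypothetical sketch: fit a desktop resolution into whatever resolution the
# display negotiated, e.g. a 1920x1080 desktop squeezed into a 1024x768 panel
# over VGA/DVI, preserving aspect ratio with black bars.

def fit_resolution(src_w, src_h, dst_w, dst_h):
    """Scale src into dst while keeping aspect ratio (letterbox/pillarbox)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    # Black bars fill whatever the scaled image does not cover.
    bars_x, bars_y = dst_w - out_w, dst_h - out_h
    return out_w, out_h, bars_x, bars_y

if __name__ == "__main__":
    print(fit_resolution(1920, 1080, 1024, 768))  # -> (1024, 576, 0, 192): letterboxed
```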
-
OMG! You just said "HDTV" and "1366x768" in the same sentence!?!?!?!
-
I agree with dafyre, I don't see any other way.
How does Europe handle these updating codecs?
The US, and possibly the world, had it pretty easy for the first 50 years of TV, very little change, but now there seems to be constant change - but the devices are too expensive to make this change globally (in the US at least), so OTA has to be very slow to change, and well, cable providers will have to continue to update their boxes on a regular basis to be able to handle the changes - which seems unlikely, so instead they'll just change the signal at their DC before pushing it to the end user boxes, and we the consumers end up with old codecs for a while.
Internet broadcasts, where you the consumer control which set-top box you have, will give you the most up to date options, as they always have. Of course, this means that the vendors, think Hulu and Netflix, have to update themselves as well to gain any advantage from the new codecs.
-
@art_of_shred said:
OMG! You just said "HDTV" and "1366x768" in the same sentence!?!?!?!
I think his emphasis was on the earlier models.
-
@Dashrender said:
I agree with dafyre, I don't see any other way.
How does Europe handle these updating codecs?
The US, and possibly the world, had it pretty easy for the first 50 years of TV, very little change, but now there seems to be constant change - but the devices are too expensive to make this change globally (in the US at least), so OTA has to be very slow to change, and well, cable providers will have to continue to update their boxes on a regular basis to be able to handle the changes - which seems unlikely, so instead they'll just change the signal at their DC before pushing it to the end user boxes, and we the consumers end up with old codecs for a while.
Well, the rules apply to broadcast TV, meaning the end users are responsible for them. With cable it's much different, and they typically do not adopt the latest right away.
But the EU has had SCART for many years, allowing the TV to just be a TV; it was a bi-directional connection between TVs and set-top boxes. Of course now you'd use HDMI or something, but they've had this since like the 1970s.
-
@Dashrender said:
so OTA has to be very slow to change, and well, cable providers will have to continue to update their boxes on a regular basis to be able to handle the changes
It's the opposite. OTA has deadlines and is usually higher quality than much of the cable systems. Cable TV transmits in a million different ways; very few do HD over coax. It's usually some form of digitally encoded signal going to a proprietary system they use, or IPTV.
-
@thecreativeone91 said:
@Dashrender said:
so OTA has to be very slow to change, and well, cable providers will have to continue to update their boxes on a regular basis to be able to handle the changes
It's the opposite. OTA has deadlines and is usually higher quality than much of the cable systems. Cable TV transmits in a million different ways; very few do HD over coax. It's usually some form of digitally encoded signal going to a proprietary system they use, or IPTV.
Say what? Help me out there, cable doesn't broadcast 1080i/p or 720i/p over coax? I suppose they could be streaming IPTV to their set-top box in my house... but all that comes over my coax. Basically I'm asking for more explanation.
-
@Dashrender said:
Say what? Help me out there, cable doesn't broadcast 1080i/p or 720i/p over coax? I suppose they could be streaming IPTV to their set-top box in my house... but all that comes over my coax. Basically I'm asking for more explanation.
It is not a raw signal on the coax; it is data.
-
@JaredBusch said:
@Dashrender said:
Say what? Help me out there, cable doesn't broadcast 1080i/p or 720i/p over coax? I suppose they could be streaming IPTV to their set-top box in my house... but all that comes over my coax. Basically I'm asking for more explanation.
It is not a raw signal on the coax; it is data.
Oh - you mean data that's intended to be processed by the set-top box and then displayed on my TV... OK, gotcha.
-
Most cable boxes now are pretty much video processors that descramble the (often) encrypted data from the cable companies and feed it to your TV in recognizable chunks (this is what causes the artifacts / blocky look from time to time).
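To make the "it's just data" point concrete: on most digital cable plants the video rides in MPEG transport stream packets (188 bytes, sync byte 0x47), whatever modulation or encryption sits on top. A rough sketch, assuming you already have a captured .ts file to poke at (the scan_ts helper is made up for illustration; a real box also descrambles, demuxes, and decodes):

```python
# Count packets per PID in an MPEG transport stream capture and flag the
# ones marked as scrambled. This only inspects packet headers; it does not
# decrypt or decode anything.

import sys
from collections import Counter

PACKET_SIZE = 188
SYNC_BYTE = 0x47

def scan_ts(path):
    pids, scrambled = Counter(), Counter()
    with open(path, "rb") as f:
        while (pkt := f.read(PACKET_SIZE)):
            if len(pkt) < PACKET_SIZE or pkt[0] != SYNC_BYTE:
                break  # lost sync; a real demuxer would resynchronize
            pid = ((pkt[1] & 0x1F) << 8) | pkt[2]  # 13-bit packet identifier
            pids[pid] += 1
            if pkt[3] & 0xC0:                      # transport_scrambling_control bits
                scrambled[pid] += 1
    return pids, scrambled

if __name__ == "__main__":
    pids, scrambled = scan_ts(sys.argv[1])
    for pid, count in pids.most_common(10):
        print(f"PID 0x{pid:04X}: {count} packets, {scrambled[pid]} scrambled")
```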
-
Of course - and now that cable companies are dumping the analog signals too, all TVs will have to use a set-top box to decode their digital signals.
We just had Cox out a few weeks ago to install the boxes on all of our TVs around the waiting rooms.
Is there no standard for transmitting unencrypted digital TV signals?
-
@Dashrender said:
Is there no standard for transmitting unencrypted digital TV signals?
With ATSC you can do it in the same manner as OTA, just over the cable. They don't want to, though.
-
@thecreativeone91 said:
@Dashrender said:
Is there no standard for transmitting unencrypted digital TV signals?
With ATSC you can do it in the same manner as OTA, just over the cable. They don't want to, though.
They don't, so they can charge me more money? I'd just like to see the status quo stay where it is... the basic channels and HBO 1 unencrypted... the rest can stay like they have always been, requiring a set-top box...
yeah yeah.. I know.. wish in one hand...
-
@Mike-Ralston said in 4K vs UHD:
@scottalanmiller There's a whole lot of terms, and every single one is standardized, it's just that the general public doesn't know what each one means (but isn't that always the case?), and neither do people in marketing for said products. 720p and 720i are both HD, 1080p and 1080i are FHD (Full High Definition). 4K is actually a 16:10 at 4096 x 2160, which is not the consumer standard for "4K", which is actually 3840 x 2160, and is technically called UHD.
These are all "opinions", not standards. That 1080p is called "Full HD" implies that 1080i, 720p, etc. fall short of being HD. Anything less than full is "partial". 11" can be called a partial foot, but that doesn't make it a foot. HD isn't a technical term, it's a useless relative description. Full HD is not HD today, it's less. 4K (UHD) is our "standard definition" today. Can't claim anything to be HD that isn't far beyond that.
Some things are actual standards. Other things like 4K and HD are just "words" that people use, with no standard or definite meaning behind them. Lots of people claim that they mean something definite, but they are meaningless terms, and none of those people on their high horses, each sure that "they" have the one standard, agree with each other. It's just buzz words for marketing. Acting like they are a single standard is a bit pretentious when it's obvious that they are the very definition of non-standard. Everyone has their own idea of what it means in their minds and it doesn't match what anyone else thinks.
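For whatever the labels are worth, the raw numbers behind them are easy enough to compare; a trivial sketch (nothing here is a standard, just arithmetic):

```python
# Back-of-envelope comparison of the resolutions these labels get attached to.

RESOLUTIONS = {
    "720p":        (1280, 720),
    "1080p (FHD)": (1920, 1080),
    "UHD":         (3840, 2160),
    "DCI 4K":      (4096, 2160),
}

BASELINE = 1920 * 1080  # pixels in a 1080p frame

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:13s} {w}x{h}  {pixels:>9,d} px  "
          f"aspect {w / h:.3f}:1  {pixels / BASELINE:.1f}x 1080p")
```

That puts UHD at four times the pixels of "Full HD", and 4096x2160 at roughly 1.90:1 (about 17:9), for what those labels are worth.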
-
@Dashrender said in 4K vs UHD:
Is there no standard for transmitting non encrypted digital TV signals?
There "are" standards, as in there are protocol definitions. But no one has to use them and there is a lot of value to the vendors to do their own unique things.
-
@Mike-Ralston said in 4K vs UHD:
4K is actually a 16:10 at 4096 x 2160
Actually, even DCI, who make that projection definition, don't have a single standard, and DCI 4K (which is not 4K, it's DCI 4K) isn't a standard but a list of standards. So even by the book, DCI 4K doesn't necessarily match what you claim to be a requirement. So even if you believe that DCI has the right to own the term 4K, you don't end up with a 4096-pixel standard.