4K vs UHD
-
I see a huge potential problem in the coming months/years as there are more and more options. TVs will need to be flash-updatable so they can be upgraded to convert signals on the fly, or else we'll have people with TV sets that just aren't usable.
-
@Dashrender said:
I see a huge potential problem in the coming months/years as there are more and more options. TVs will need to be flash-updatable so they can be upgraded to convert signals on the fly, or else we'll have people with TV sets that just aren't usable.
I think it would be a good idea for TVs to be flashable, and many of the newer "smart" TVs are. However, I think it may make more sense to use set-top boxes like folks do now with HD antennas, except instead of just doing HDTV over the air, the box will scale the video to fit the resolution of the TV it is connected to.
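Just to make the idea concrete, here's a rough Python sketch of the kind of aspect-ratio-preserving scaling math such a box would do (the resolutions are only example values, not anything from a real spec):
```python
def fit_resolution(src_w, src_h, dst_w, dst_h):
    """Scale (src_w, src_h) to fit inside (dst_w, dst_h), preserving aspect ratio."""
    scale = min(dst_w / src_w, dst_h / src_h)
    # Round down to even numbers, which video pipelines generally expect
    out_w = int(src_w * scale) // 2 * 2
    out_h = int(src_h * scale) // 2 * 2
    return out_w, out_h

# 4K DCI content on a 1080p TV -> 1920x1012, i.e. letterboxed
print(fit_resolution(4096, 2160, 1920, 1080))
# UHD (3840x2160) content on a 1080p TV -> exactly 1920x1080
print(fit_resolution(3840, 2160, 1920, 1080))
```
A real box would do this in hardware, of course, and pad the difference with black bars.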
-
@Dashrender said:
I see a huge potential problem in the coming months/years as there are more and more options. TVs will need to be flash-updatable so they can be upgraded to convert signals on the fly, or else we'll have people with TV sets that just aren't usable.
Unlikely. All of these are hardware-specific decoders built for a particular format (ASICs) rather than general-purpose processors, so a firmware update can't teach them a new codec. It would also cut into their sales of newer TVs.
-
@Dashrender said:
....TVs will need to be flash-updatable so they can be upgraded to support ....
Yesssss, make them updatable... I can't think of anything evil I could possibly do with a Wi-Fi-connected device with speakers, and soon microphones, in your living room...
Excuse me, I need to go find a hairless cat and some henchmen.
-
@dafyre said:
@Dashrender Well that depends on how the signal is encoded... What happens if you try to play a 4096x2160 video on a 1920x1080 TV? If it is hooked up to a PC, then the PC handles the scaling of the video to the right resolution...
Depends on whether the TV supports higher-res inputs or not.
-
@dafyre said:
@Dashrender said:
I see a huge potential problem in the coming months/years as there are more and more options. TVs will need to be flash-updatable so they can be upgraded to convert signals on the fly, or else we'll have people with TV sets that just aren't usable.
I think it would be a good idea for TVs to be flashable, and many of the newer "smart" TVs are. However, I think it may make more sense to use set-top boxes like folks do now with HD antennas, except instead of just doing HDTV over the air, the box will scale the video to fit the resolution of the TV it is connected to.
It's not just scaling, it's codecs too. OTA with UHD will be H.265, whereas it's currently MPEG-2. A set-top box will be required for OTA. I don't think we'll see many upgradable ones, not unless they were designed with something specific in mind while the specs were still loosely defined.
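To put that in concrete terms, here's a small sketch (assuming ffprobe from FFmpeg is installed; the capture filename is made up) that checks whether a recorded stream is the MPEG-2 that today's OTA tuners decode or the H.265/HEVC a UHD broadcast would use:
```python
import subprocess

def video_codec(path):
    """Ask ffprobe (part of FFmpeg) for the codec of the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

# Hypothetical capture file -- today's ATSC OTA is MPEG-2, a UHD broadcast would be HEVC
codec = video_codec("ota_capture.ts")
print("Existing MPEG-2 hardware decoder is enough" if codec == "mpeg2video"
      else f"Needs a newer hardware decoder (codec: {codec})")
```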
-
@thecreativeone91 said:
@dafyre said:
@Dashrender Well that depends on how the signal is encoded... What happens if you try to play a 4096x2160 video on a 1920x1080 TV? If it is hooked up to a PC, then the PC handles the scaling of the video to the right resolution...
Depends on whether the TV supports higher-res inputs or not.
In the case of hooking a PC up to the TV, the PC determines the TV's resolution and sets it appropriately, and the PC handles the decoding / displaying of the video.
-
@thecreativeone91 said:
It's not just scaling, it's codecs too. OTA with UHD will be H.265, whereas it's currently MPEG-2. A set-top box will be required for OTA. I don't think we'll see many upgradable ones, not unless they were designed with something specific in mind while the specs were still loosely defined.
Good point... More and more, though, I see us going backwards a bit... back to requiring set-top boxes (STBs) to handle the decoding/scaling of the video, and the TV just being a TV again.
-
@dafyre said:
In the case of hooking a PC up to the TV, the PC determines the TV's resolution and sets it appropriately, and the PC handles the decoding / displaying of the video.
Kinda. It's based on EDID. If the TV advertises a higher resolution than the panel actually has, there's nothing stopping you from sending it that higher resolution. Nor is the native panel resolution necessarily the default input resolution.
For example, some early HDTVs had a panel resolution of 1366x768, yet they would default to 1280x720 for PC input.
This is really common on projectors that have a native resolution of, say, 800x600 or 1024x768, but report a default resolution of 1366x768 or higher.
There are also some HDTVs (1920x1080) that want a 1024x768 signal over the VGA/DVI ports for some reason.
-
@thecreativeone91 said:
(snip)
There are also some HDTVs (1920x1080) that want a 1024x768 signal over the VGA/DVI ports for some reason.
Right. I wasn't thinking about the early HDTVs or projectors. But generally it doesn't matter what resolution is negotiated between the PC and the device... the PC is the one responsible for sending the video output and scaling it to the current resolution.
Edit: Sometimes it works well, and other times it totally sucks.
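For anyone curious what that negotiation actually is: the PC just reads the TV's EDID block and picks a mode from it. Here's a minimal sketch that pulls the preferred resolution out of the first detailed timing descriptor (it assumes Linux, which exposes the raw EDID under /sys/class/drm; the connector name is hypothetical):
```python
def preferred_mode(edid: bytes):
    """Extract the preferred resolution from the first detailed timing
    descriptor of a 128-byte (or longer) EDID block."""
    assert len(edid) >= 128 and edid[0:2] == b"\x00\xff", "not an EDID block"
    d = edid[54:72]                      # first detailed timing descriptor
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active

# Hypothetical connector path; the kernel exposes the raw EDID here on Linux
with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    print(preferred_mode(f.read()))      # e.g. (1920, 1080) on a 1080p TV
```
Whether the TV's preferred mode matches the panel's native resolution is exactly the weirdness described above.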
-
OMG! You just said "HDTV" and "1366x768" in the same sentence!?!?!?!
-
I agree with dafyre, I don't see any other way.
How does Europe handle these codec updates?
The US, and possibly the world, had it pretty easy for the first 50 years of TV; there was very little change. Now there seems to be constant change, but the devices are too expensive to change globally (in the US at least), so OTA has to be very slow to change, and cable providers will have to continue to update their boxes on a regular basis to handle the changes, which seems unlikely. So instead they'll just convert the signal at their data center before pushing it to the end-user boxes, and we the consumers end up with old codecs for a while.
Internet streaming, where you the consumer control which set-top box you have, will give you the most up-to-date options, as it always has. Of course, this means the vendors, think Hulu and Netflix, have to update themselves as well to gain any advantage from the new codecs.
-
@art_of_shred said:
OMG! You just said "HDTV" and "1366x768" in the same sentence!?!?!?!
I think his emphasis was on the earlier models.
-
@Dashrender said:
I agree with dafyre, I don't see any other way.
How does Europe handle these codec updates?
The US, and possibly the world, had it pretty easy for the first 50 years of TV; there was very little change. Now there seems to be constant change, but the devices are too expensive to change globally (in the US at least), so OTA has to be very slow to change, and cable providers will have to continue to update their boxes on a regular basis to handle the changes, which seems unlikely. So instead they'll just convert the signal at their data center before pushing it to the end-user boxes, and we the consumers end up with old codecs for a while.
Well, the rules apply to broadcast TV, meaning the end users are responsible for keeping up with them. With cable it's much different, and they typically do not adopt the latest standards right away.
But the EU has had SCART for many years, allowing the TV to just be a TV; it's a bidirectional connection between TVs and set-top boxes. Of course, now you'd use HDMI or something, but they've had this since like the 1970s.
-
@Dashrender said:
so OTA has to be very slow to change, and cable providers will have to continue to update their boxes on a regular basis to handle the changes
It's the opposite: OTA has deadlines and is usually higher quality than much of the cable systems. Cable TV transmits in a million different ways; very few send plain HD over the coax. It's usually some form of digitally encoded signal going to a proprietary system they use, or IPTV.
-
@thecreativeone91 said:
@Dashrender said:
so OTA has to be very slow to change, and cable providers will have to continue to update their boxes on a regular basis to handle the changes
It's the opposite: OTA has deadlines and is usually higher quality than much of the cable systems. Cable TV transmits in a million different ways; very few send plain HD over the coax. It's usually some form of digitally encoded signal going to a proprietary system they use, or IPTV.
Say what? Help me out there, cable doesn't broadcast 1080i/p or 720p over coax? I suppose they could be streaming IPTV to their set-top box in my house... but all of that comes over my coax. Basically I'm asking for more explanation.
-
@Dashrender said:
Say what? Help me out there, cable doesn't broadcast 1080i/p or 720p over coax? I suppose they could be streaming IPTV to their set-top box in my house... but all of that comes over my coax. Basically I'm asking for more explanation.
It is not a raw signal on the coax; it is data.
-
@JaredBusch said:
@Dashrender said:
Say what? Help me out there, cable doesn't broadcast 1080i/p or 720p over coax? I suppose they could be streaming IPTV to their set-top box in my house... but all of that comes over my coax. Basically I'm asking for more explanation.
It is not a raw signal on the coax; it is data.
Oh, you mean data that's intended to be processed by the set-top box and then displayed on my TV... OK, gotcha.
-
Most cable boxes now are pretty much video processors that descramble the (often) encrypted data from the cable companies and feed it to your TV in recognizable chunks (this is what causes the artifacts / blocky look from time to time).
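That data is normally an MPEG transport stream: fixed 188-byte packets, each starting with a 0x47 sync byte, carrying the compressed video and audio for each program. Here's a quick sketch (the capture filename is hypothetical) that just counts packets per PID, to show it really is packetized data rather than a raw picture:
```python
from collections import Counter

TS_PACKET = 188          # MPEG transport stream packets are fixed-size
SYNC_BYTE = 0x47

def count_pids(path, limit=10000):
    """Count how many packets belong to each PID (program/elementary stream ID)."""
    pids = Counter()
    with open(path, "rb") as f:
        for _ in range(limit):
            pkt = f.read(TS_PACKET)
            if len(pkt) < TS_PACKET or pkt[0] != SYNC_BYTE:
                break                      # end of file or lost sync
            pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
            pids[pid] += 1
    return pids

# Hypothetical capture of the digital data coming off the coax
print(count_pids("cable_capture.ts").most_common(5))
```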
-
Of course. And now that cable companies are dumping the analog signals too, all TVs will have to use a set-top box to decode their digital signals.
We just had Cox out a few weeks ago to install the boxes on all of our TVs around the waiting rooms.
Is there no standard for transmitting unencrypted digital TV signals?