4K vs UHD
-
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
This use of the 'p' was my understanding.
-
@scottalanmiller said:
@Mike-Ralston said:
And they understand perfectly that FHDi is 1920 x 1080 Interlaced, it's pretty simple.
What makes that simple, logical, or even remotely likely? Have you tried Googling that term? I've never heard of it, and Google seems pretty stumped too. I have a feeling that's a new one created here; once Google gets through this thread, maybe it will show up.
1920x1080 interlaced isn't enough to create a useful standard. The terms you are using aren't ones that the people you think need them could use. Those people do have standards, but they are not ones sold to consumers or end users.
And "standards" between business partners don't rely on people coming up with marketing names. They talk in detailed standards at a lower level.
Sure, if you call it "full HD interlaced," then that's what FHDi would mean, so yes, it would mean 1920x1080 interlaced. It's not that simple though. There are multiple flavors of HD, and that only refers to the resolution.
Progressive/interlaced and frame rate are all separate things. You could say that 1920x1080 interlaced HD is likely going to be 30fps as well, but it does not have to be, nor is that a standard.
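(Not from the thread, but to make the point concrete that resolution, scan type, and frame rate are three independent fields: a minimal Python sketch that pulls apart informal labels like "1080i30" or "720p60". The label shorthand and the parse_label helper are illustrative only, not any formal standard.)

```
# Hypothetical helper (illustration only): split an informal video label such
# as "1080i30" or "720p60" into its three independent parts - vertical
# resolution, scan type, and frame rate.
import re

def parse_label(label):
    """Return (lines, scan, fps) for labels like '1080p60', '1080i30', or '1080i'."""
    m = re.fullmatch(r"(\d+)([ip])(\d+(?:\.\d+)?)?", label)
    if not m:
        raise ValueError(f"unrecognized label: {label}")
    lines = int(m.group(1))                                # resolution only
    scan = "progressive" if m.group(2) == "p" else "interlaced"
    fps = float(m.group(3)) if m.group(3) else None        # often left unstated
    return lines, scan, fps

print(parse_label("1080i30"))  # (1080, 'interlaced', 30.0)
print(parse_label("1080p"))    # (1080, 'progressive', None) - frame rate is a separate choice
```
-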
Has anyone seen or heard of 4K interlaced yet?
-
@thecreativeone91 said:
@Mike-Ralston said:
@Dashrender There's your answer. Film is done to look good, TV is done to comply with stringent FCC rules.
The FCC only adopts the rules created by the broadcast associations. It would be impossible to deliver film-level quality OTA. You get shipped multiple hard drives that form a RAID array to plug directly into the digital projector, both because of the size and the data rates needed for high quality. Film is DPX files, one file per frame (it's actually just a large picture file with no compression). Audio is done separately and synced with timecode.
LOL I heard that - it's basically a giant flip book.. LOL
Though I heard they are streaming or at least downloading that data now, is that not true?
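(A rough back-of-envelope sketch of why that ships on hard drives rather than streaming - the numbers below are assumptions of mine, not figures from the thread:)

```
# Back-of-envelope sketch with assumed numbers (not from the thread): the data
# rate of an uncompressed 10-bit RGB DPX image sequence at DCI 4K, ignoring
# per-frame header and word-packing overhead.
width, height = 4096, 2160        # DCI 4K frame
bits_per_pixel = 3 * 10           # RGB, 10 bits per channel
fps = 24                          # typical cinema frame rate

frame_bytes = width * height * bits_per_pixel / 8
rate_mb_s = frame_bytes * fps / 1e6
feature_tb = frame_bytes * fps * 2 * 3600 / 1e12   # a roughly 2-hour feature

print(f"~{frame_bytes / 1e6:.0f} MB per frame")       # ~33 MB
print(f"~{rate_mb_s:.0f} MB/s sustained")             # roughly 800 MB/s
print(f"~{feature_tb:.1f} TB for a 2-hour feature")   # ~5.7 TB
```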
-
@Dashrender said:
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
This use of the 'p' was my understanding.
If you refer to something as 1080p or 720p, then yes, it would be progressive. But it doesn't have to be progressive; there is 1080i and 1080p. He also contradicted himself, saying here that HD is only progressive and then later stating that it's understood that full HD is 1080i.
-
@Dashrender said:
@thecreativeone91 said:
@Mike-Ralston said:
@Dashrender There's your answer. Film is done to look good, TV is done to comply with stringent FCC rules.
The FCC only adopts the rules created by the broadcast associations. It would be impossible to deliver film-level quality OTA. You get shipped multiple hard drives that form a RAID array to plug directly into the digital projector, both because of the size and the data rates needed for high quality. Film is DPX files, one file per frame (it's actually just a large picture file with no compression). Audio is done separately and synced with timecode.
LOL I heard that - it's basically a giant flip book.. LOL
Though I heard they are streaming or at least downloading that data now, is that not true?
Nope, all hard drives. The only thing online about it is unlocking them; they have to be decrypted per showing to make sure theaters don't do unlicensed showings by using them in slots they didn't pay for. That used to be a common thing in the 35mm/S35mm film days, which weren't that long ago.
-
@Dashrender said:
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
This use of the 'p' was my understanding.
Yes, the need for i and p arose because HD did not originally imply either. If you use HD in the English sense, there is nothing interlaced that is truly HD. But if you use it the marketing way, you must specify, since low-definition interlacing is common.
-
@scottalanmiller said:
@thecreativeone91 said:
Granted Hulu does this I think.
Yeah, they suck.
Exactly - why would anyone want Hulu? I suppose if you could ditch cable and just do Hulu.. that might be OK.
-
@Dashrender said:
Exactly - why would anyone want Hulu? I suppose if you could ditch cable and just do Hulu.. that might be OK.
Because they have exclusive show rights.
-
@MattSpeller said:
Has anyone seen or heard of 4K interlaced yet?
Hopefully no one will use it. High-end cameras, which are the only ones shooting true 4K, don't offer interlaced, because you'd be insane to use it. I would not be surprised to see UHD 2160i at some point, though, even though interlacing sucks, especially nowadays; it was really meant for the CRT era.
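(For anyone who hasn't dealt with interlacing, a toy sketch of what it actually is - two half-height fields per frame that a "weave" deinterlacer re-interleaves; the data here is made up purely for illustration:)

```
# Toy sketch (made-up frame data, not broadcast code): an interlaced signal
# sends each frame as two half-height fields - even lines, then odd lines -
# and a simple "weave" deinterlace just re-interleaves them.

def split_fields(frame):
    """Split a progressive frame (a list of scan lines) into top/bottom fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Rebuild a full-height frame by interleaving the two fields."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = list(range(1080))            # pretend each scan line is just its index
top, bottom = split_fields(frame)
assert len(top) == len(bottom) == 540
assert weave(top, bottom) == frame
# 1080i sends 540-line fields (e.g. 60 fields/s); weaving only looks right when
# nothing moved between the two fields, which is why motion combs on interlaced video.
```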
-
@thecreativeone91 said:
@Dashrender said:
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
This use of the 'p' was my understanding.
If you refer to something as 1080p or 720p, then yes, it would be progressive. But it doesn't have to be progressive; there is 1080i and 1080p. He also contradicted himself, saying here that HD is only progressive and then later stating that it's understood that full HD is 1080i.
Letting go of the last part - I've heard, though never really cared - that HD is either 720p or 1080i or 1080p, but HD could never be used to describe 720i. Don't ask me where I heard that, but I have heard it for years.
I'll have to double-check, but I'm pretty sure my local cable company is transmitting at 720p (at least that is what my TV is telling me when I turn it on). I should call and ask if the box can be bumped to 1080p - and whether it would make any real difference if the cable company is limiting itself to 720p anyway.
-
@thecreativeone91 said:
@Dashrender said:
@thecreativeone91 said:
@Mike-Ralston said:
@Dashrender There's your answer. Film is done to look good, TV is done to comply with stringent FCC rules.
The FCC only adopts the rules created by the broadcast associations. It would be impossible to deliver film-level quality OTA. You get shipped multiple hard drives that form a RAID array to plug directly into the digital projector, both because of the size and the data rates needed for high quality. Film is DPX files, one file per frame (it's actually just a large picture file with no compression). Audio is done separately and synced with timecode.
LOL I heard that - it's basically a giant flip book.. LOL
Though I heard they are streaming or at least downloading that data now, is that not true?
Nope, all hard drives. The only thing online about it is unlocking them; they have to be decrypted per showing to make sure theaters don't do unlicensed showings by using them in slots they didn't pay for. That used to be a common thing in the 35mm/S35mm film days, which weren't that long ago.
I know our local theaters had employee showings usually a day or two beforehand.. those days are gone.
-
@scottalanmiller said:
@Dashrender said:
Exactly - why would anyone want Hulu? I suppose if you could ditch cable and just do Hulu.. that might be OK.
Because they have exclusive show rights.
It's actually worse than cable in some respects, because I can't fast-forward through commercials.
-
@Dashrender said:
Letting go of the last part - I've heard, though never really cared - that HD is either 720p or 1080i or 1080p, but HD could never be used to describe 720i. Don't ask me where I heard that, but I have heard it for years.
Well, HD would refer to resolution only, so it could be 720i, 720p, 1080i, or 1080p. But the reality is that 720i is almost never used; 720i is not a broadcast standard either.
-
@thecreativeone91 said:
@Dashrender said:
Letting go of the last part - I've heard, though never really cared - that HD is either 720p or 1080i or 1080p, but HD could never be used to describe 720i. Don't ask me where I heard that, but I have heard it for years.
Well, HD would refer to resolution only, so it could be 720i, 720p, 1080i, or 1080p. But the reality is that 720i is almost never used; 720i is not a broadcast standard either.
Well that's probably where these sales guys get it from... it's not broadcast.. so the chances of seeing it are low... so we'll just not include it.
-
@Dashrender said:
Letting go of the last part - I've heard, though never really cared - that HD is either 720p or 1080i or 1080p, but HD could never be used to describe 720i. Don't ask me where I heard that, but I have heard it for years.
They were calling 480p "HD" back when we were doing home theatre installs with it before the last generation of HD terminology rolled around. 480p was part of the same spec group as 720i, 720p, 1080i and 1080p once upon a time.
-
@scottalanmiller said:
@Dashrender said:
Letting go of the last part - I've heard, though never really cared - that HD is either 720p or 1080i or 1080p, but HD could never be used to describe 720i. Don't ask me where I heard that, but I have heard it for years.
They were calling 480p "HD" back when we were doing home theatre installs with it before the last generation of HD terminology rolled around. 480p was part of the same spec group as 720i, 720p, 1080i and 1080p once upon a time.
They call it EDTV today.
-
For me, with this type of technology, it is all about the specs given. If I'm going to buy a 4K TV, I'm going to want to know the resolution. It doesn't matter to me if you call it 4K, UHD, FHD, or SupercalifragilisticHD... (God, please, don't make me type that again...). If the resolution is only 320 x 240 on my shiny new 65" TV, I think I may have a problem with picture quality.
However, on a TV with higher resolutions, there comes a point where the human eye won't be able to tell the difference between the various resolutions... 4096 x 2160 and 3840 x 2160 could be a good example. Can the human eye even detect a single pixel at that level on a 55-inch screen?
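(A rough answer to that last question, assuming the standard one-arcminute figure for 20/20 visual acuity - an assumption of mine, not something from the thread:)

```
# Rough sketch (assumes ~1 arcminute of resolving power for 20/20 vision, which
# is an assumption, not a thread figure): pixel pitch on a 55" 16:9 UHD panel
# and the viewing distance beyond which adjacent pixels can't be told apart.
import math

diagonal_in = 55
h_px, v_px = 3840, 2160                           # UHD; DCI 4K would be 4096 x 2160

width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from the diagonal
pitch_in = width_in / h_px                        # size of a single pixel
ppi = h_px / width_in

acuity_rad = math.radians(1 / 60)                 # 1 arcminute in radians
max_dist_ft = (pitch_in / acuity_rad) / 12        # small-angle approximation

print(f"{ppi:.0f} ppi, pixel pitch {pitch_in:.4f} in")
print(f"Individual pixels blend together beyond ~{max_dist_ft:.1f} ft")   # ~3.6 ft
```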
-
@dafyre No no no - once the resolution is high enough for a given size and distance, you can just make a bigger TV/monitor or sit closer.