4K vs UHD



  • So I don't change the topic of http://mangolassi.it/topic/5302/tv-as-a-monitor/8 I'll ask my question here.

    @thecreativeone91 - any thoughts why film and TV folks haven't gotten on the same page when it comes to recording resolutions? I can understand not doing it 10+ years ago, since normal consumers didn't have access to widescreen displays, but that problem is gone now. Displays can be made in virtually any dimensions, so I don't understand the need for, or continuance of, the split.



  • The nice thing about standards is that you have so many to choose from. - Andrew S Tanenbaum



  • @Dashrender Are you asking why movies are not shot in the same aspect ratio that TVs and monitors use?



  • Umm, I think the question was why TVs are not being produced to match cinema standards? I know that television shows are typically produced differently than cinema, but again, why can't that be standardized? Now that everything is going digital, is there any real prohibitive expense holding up a shift?



  • Now that cinema is in the back seat and TV or other non-cinema content is the primary source, and since cinema has long had many different formats while TV only had a few, maybe it is cinema that should get off its high horse and use the existing standards.



  • I'm betting that UHD is more for television broadcast, and is more easily understood by the consumer. HD they know now, and UHD is better because it's "ultra". 4K is already accepted nomenclature among the nerds and gamers, and I think we'll continue to see that used along with specific resolutions like 2160p.



  • @Nic said:

    I'm betting that UHD is more for television broadcast, and is more easily understood by the consumer. HD they know now, and UHD is better because it's "ultra". 4K is already accepted nomenclature among the nerds and gamers, and I think we'll continue to see that used along with specific resolutions like 2160p.

    So, you're basically saying that "UHD" and "4K" are the "organic" of nerd culture. I agree.



  • I honestly think having anything other than a straight resolution is a problem. Having names for resolutions and colour settings like "VGA" made sense in the 1980s. Not sure it does today. All of those names appear to exist for the purpose of misleading, not informing. And unless you have a certifying body that makes the standards and enforces them, they are just loose terms. HD, UHD.... people have been using those terms for decades. We were calling things HD in the 80s, and I'm sure they were in the 60s too. But suddenly it meant 480p to some people and 720p to others, then those were considered lies and only 1080p was HD. And some people thought 720i could be HD, but not others.

    None of these terms are useful.



  • @scottalanmiller said:

    I honestly think having anything other than a straight resolution is a problem.

    Sounds a little homophobic...



  • @scottalanmiller There's a whole lot of terms, and every single one is standardized, it's just that the general public doesn't know what each one means (but isn't that always the case?), and neither do people in marketing for said products. 720p and 720i are both HD, 1080p and 1080i are FHD (Full High Definition). 4K is actually roughly 17:9 at 4096 x 2160, which is not the consumer standard for "4K", which is actually 3840 x 2160, and is technically called UHD.
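    A quick sanity check on the two resolutions mentioned above (a minimal Python sketch; the `aspect` helper is just illustrative):

```python
from math import gcd

# Illustrative helper: reduce a resolution to its simplest aspect ratio
# and also give the decimal width/height ratio.
def aspect(w, h):
    g = gcd(w, h)
    return (w // g, h // g), round(w / h, 3)

print(aspect(4096, 2160))  # DCI 4K  -> ((256, 135), 1.896), close to 17:9
print(aspect(3840, 2160))  # UHD     -> ((16, 9), 1.778), exactly 16:9
```

    Reducing each pair by its GCD shows DCI 4K is roughly 17:9 (256:135, about 1.90:1), while consumer UHD is exactly 16:9.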



  • I agree Scott - HD and UHD are non-specific marketing terms. HD for consumer broadcast can mean 720p or 1080p, and I'm assuming UHD will have some equivalent fuzziness.



  • @Mike-Ralston said:

    @scottalanmiller There's a whole lot of terms, and every single one is standardized, it's just that the general public doesn't know what each one means (but isn't that always the case?), and neither do people in marketing for said products. 720p and 720i are both HD, 1080p and 1080i are FHD (Full High Definition).

    Those are later changes to pre-existing terms. HD is not something that can be standardized, since the term vastly predates any "after the fact" standardization. This is a misuse of the term "standardize." HD, even in the 480 era, was old, and was used differently by everyone.

    Who, exactly, do you feel owns these terms to standardize them? 4K might have an owner as that is a new term AFAIK, but HD is an old one and does not have an owner.



  • @Mike-Ralston said:

    4K is actually roughly 17:9 at 4096 x 2160, which is not the consumer standard for "4K", which is actually 3840 x 2160, and is technically called UHD.

    According to @thecreativeone91 UHD and 4K are different standards. And UHD can't be standardized due to the nature of its name (again, just a loose, long running description.) And he works in commercial video so probably knows.

    HD is like HA. It's a moving target. You can't legitimately even call 1080p HD today; it is, at best, standard definition. It's not high by any standard, low by many, but it still owns a lot of the mainstream.



  • I thought normal tv programs were still broadcast in 720p due to bandwidth.



  • I agree with Scott, using names instead of the actual resolution is no longer useful. I'm sure it's only kept around so sales can use it to confuse and oversell to consumers.



  • @scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived... Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.



  • @Dashrender said:

    I thought normal tv programs were still broadcast in 720p due to bandwidth.

    "Normal TV" is something that has become a backwater. But many, many years ago the last time that I looked into it, even rural programs out here in the middle of nowhere were broadcast higher than that commonly. And in Spain I know that over the air is on 4K.



  • @Mike-Ralston said:

    Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.

    Agreed



  • @Mike-Ralston said:

    @scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived...

    This is completely untrue. The names predate those things. Some, like 4K, might have a ratifying body; HD simply cannot. There is no way to standardize a name of that sort. It predates the FCC. It cannot be standardized, as it is an English term that means something.



  • @Mike-Ralston said:

    Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.

    Nor does marketing claiming that something is a standard make it so. HD simply predates the standard. We had HD when I was little and no one had dreamed up 480p yet.



  • @Mike-Ralston said:

    @scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived... Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.

    According to Wikipedia, even 4K is not a standard, just a loose group of standards and assumptions. Some people have made standards around it, but making "a" standard is the opposite of the term being "the" standard. Anything near 4,000 pixels of horizontal resolution is a 4K according to the page.

    So even the terms that are new and could, in theory, be standards, are not:

    https://en.wikipedia.org/wiki/4K_resolution



  • DCI 4K should not be confused with ultra-high-definition television (UHDTV) AKA "UHD-1", which has a resolution of 3840 x 2160 (16:9, or approximately a 1.78:1 aspect ratio). Many manufacturers may advertise their products as UHD 4K, or simply 4K, when the term 4K is traditionally reserved for the cinematic, DCI resolution.[3][4] This often causes great confusion among consumers.

    Wikipedia.

    It is interesting that the "standard" consumer use of 4K is for something that should really just be called 2160p.
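    The naming clash above comes down to counting conventions: broadcast names count vertical lines ("1080p", "2160p"), while cinema "K" names count horizontal pixels. A small illustrative sketch using the figures discussed in this thread:

```python
# Broadcast names count vertical lines; cinema "K" names count horizontal
# pixels. The same resolutions rendered under both conventions:
resolutions = {
    "DCI 2K": (2048, 1080),
    "DCI 4K": (4096, 2160),
    "UHD":    (3840, 2160),
    "FHD":    (1920, 1080),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {h} lines ('{h}p'), ~{w / 1000:.2f}K wide")
```

    By the cinema convention, UHD is only "3.84K" wide, which is why calling it 4K is a marketing rounding-up.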



  • 2160p/i (what people call UHD), 1080p/i, and 720p/i are all 16:9 resolutions; these specs are made by the NAB for broadcast ("p" is progressive, "i" is interlaced). The term "UHD" is not a spec; it's a term made by CES for marketing, and really all it means is higher than 1080 HD.

    Broadcast standards have to be made (NAB) and then approved (FCC). Currently this is an MPEG-2 stream (slightly better than DVD quality). The NAB wants this to change by late 2016 to an H.265 (HEVC) codec for a 2160p stream (UHD). This would mean that your 1080p/720p set would need a converter box to decode the H.265 stream into MPEG-2, which the TV knows how to decode.

    Film standards are made by DCI; they aren't approved by anyone (there's no transmission), they just become accepted standards. These will likely never be the same as TV standards, for a number of reasons, one of them being incompatible aspect ratios. It's easy to put 16:9 or 4:3 on a 16:9 TV, or to modify a CinemaScope aspect (there are a few different ones) to fit a 16:9 display. Fitting 16:9 onto a film aspect would give you much more wasted space, as films are shot with a wider aspect than TV.
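    The aspect-ratio fitting described above can be put in numbers: when content is fitted without cropping, the fraction of the display actually used is the narrower aspect ratio divided by the wider one (a minimal sketch; `used_fraction` is a hypothetical helper):

```python
# Fraction of display area actually used when content of one aspect ratio is
# fitted onto a display of another without cropping (letterbox/pillarbox).
def used_fraction(content_ar, display_ar):
    # The narrower ratio limits the fill; the rest becomes black bars.
    return min(content_ar, display_ar) / max(content_ar, display_ar)

scope_on_tv = used_fraction(2.39, 16 / 9)  # CinemaScope film on a 16:9 TV
print(round(scope_on_tv, 3))               # 0.744, i.e. ~74% of the screen
```

    By this measure, 2.39:1 scope content fills about 74% of a 16:9 screen, with the remainder going to black bars.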



  • UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.



  • @Dashrender said:

    I thought normal tv programs were still broadcast in 720p due to bandwidth.

    It depends on the broadcaster; they can easily do 1080 and many here are. But the secondary channels tend to be 720p. The hardware encoders required are much more costly at 1080 than at 720, just as progressive is more expensive than interlaced.
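    The cost argument above tracks raw sample rates: a back-of-the-envelope sketch (the `pixels_per_second` helper is made up for illustration and ignores chroma subsampling and blanking intervals):

```python
# Rough uncompressed sample rates: one way to see why 1080 encoding hardware
# costs more than 720, and progressive more than interlaced.
def pixels_per_second(width, height, rate_hz, interlaced=False):
    per_frame = width * height
    # An interlaced signal carries only half the lines in each field.
    return per_frame * rate_hz // 2 if interlaced else per_frame * rate_hz

print(pixels_per_second(1920, 1080, 60))                   # 124416000
print(pixels_per_second(1920, 1080, 60, interlaced=True))  # 62208000
print(pixels_per_second(1280, 720, 60))                    # 55296000
```

    By this rough count, 1080p60 carries more than twice the samples of 720p60, and twice those of 1080i60.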



  • @scottalanmiller said:

    Nor does marketing claiming that something is a standard make it so. HD simply predates the standard. We had HD when I was little and no one had dreamed up 480p yet.

    Just because people said "high-definition" before, those were just words people used to describe it. High-Definition is now a standard, and since the resolution was developed, it has been called as such... And regardless of WHEN the terms were coined, they're still the industry standard, and the lines are very clear as to which term applies to which resolution. The only time this isn't clear is when people haven't been educated about the proper terms, which is perfectly fine, but the industry still uses accepted standards. Everything in the electronics industry is carefully categorized, and display resolutions are the same way. And, as always, marketing is usually not correct, and popular belief tends to be based on what someone saw on Facebook or in a TV advertisement.



  • @scottalanmiller said:

    UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.

    UHD is the standard for 3840 x 2160, globally. Marketing teams call it 4K, and then suddenly the lines are blurred. It isn't 4K, it's UHD.



  • @Mike-Ralston said:

    Just because people said high-definition, those are just words people use to describe it. High-Definition is now a standard

    High Definition is not a standard. All it means is resolutions higher than standard definition. It's just an umbrella marketing term that really both 4K and UHD could fall under.



  • @Mike-Ralston said:

    Just because people said high-definition, those are just words people use to describe it. High-Definition is now a standard, and since the resolution was developed, it has been called as such...

    Maybe it is a standard to YOU but it is not what people accept as a "standard." It's just a loose term that some people claim means one thing and others claim means another. It's an English description and not eligible to be a true standard, ever. This is just how language works. That you feel it is a standard or someone hopes that it will become one or people feel like it has been back-ported to one simply doesn't make it so. It is not a standard and claiming it is doesn't change that. The term predates the standard, you can't reverse these things.

    Cloud Computing is different. The term was defined before first use and has an owner. It's a specific term that did not exist prior to use as it is used today and that's why there is an official (in the US) standard for it.

    HD is just wishful thinking. We had HD before there was any thought of there being a standard and calling things HD now that they are obviously not is really no longer correct by any stretch. 720p, for example, cannot even remotely be considered higher than normal definition.



  • @Mike-Ralston said:

    @scottalanmiller said:

    UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.

    UHD is the standard for 3840 x 2160, globally. Marketing teams call it 4K, and then suddenly the lines are blurred. It isn't 4K, it's UHD.

    Except 4K is just a marketing term too. I think you are misunderstanding the concept of a standard. There is very obviously no standard for any of these terms. Different groups (consumers, cinema, manufacturers, marketers, television broadcasters, countries) use them differently and it is accepted.

    If there were actually a standard, mislabeling would be illegal marketing and you'd be able to sue for labels being wrong. You cannot, and why? Because they are not standards.

    You can say 4K is one thing, but most of the market says it is something different. You can't go to court about that because 4K is just a marketing term, not a standard.

