4K vs UHD
-
@Mike-Ralston said:
@scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived...
This is completely untrue. The names predate those things. Some, like 4K, might have a ratifying body; HD simply cannot. There is no way to standardize a name of that sort. It predates the FCC. It cannot be standardized, as it is an English term that means something.
-
@Mike-Ralston said:
Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.
Nor does marketing claiming that something is a standard make it so. HD simply predates the standard. We had HD when I was little and no one had dreamed up 480p yet.
-
@Mike-Ralston said:
@scottalanmiller I don't know who set the standards, probably FCC, ISO, EGA, etc., but they are indeed standards, and have been since they were conceived... Marketing teams may use them incorrectly, but that doesn't change that HD is a standard, as is HD+, FHD, UHD, UXGA, and so on and so forth.
According to Wikipedia, even 4K is not a standard, just a loose group of standards or assumptions. Some people have made some standards, but making several of them is the opposite of the term being "a" standard. Anything near 4K of horizontal resolution counts as 4K according to the page.
So even the terms that are new and could, in theory, be standards, are not:
-
DCI 4K should not be confused with ultra-high-definition television (UHDTV) AKA "UHD-1", which has a resolution of 3840 x 2160 (16:9, or approximately a 1.78:1 aspect ratio). Many manufacturers may advertise their products as UHD 4K, or simply 4K, when the term 4K is traditionally reserved for the cinematic, DCI resolution.[3][4] This often causes great confusion among consumers.
Wikipedia.
It is interesting that the "standard" use of 4K is for something that, by that definition, should really be called UHD or just 2160p.
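The arithmetic behind that is easy to check. A minimal sketch in Python (the format labels are mine, for illustration), comparing the resolutions in question:

```python
formats = {
    "DCI 4K (cinema)": (4096, 2160),
    "UHD, the TV '4K'": (3840, 2160),
    "Full HD (1080p)": (1920, 1080),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h}, {w * h / 1e6:.2f} megapixels, "
          f"aspect {w / h:.2f}:1")

# UHD is exactly double 1080p in each dimension (four times the pixels),
# which is why "2160p" describes it more precisely than "4K" does.
print(3840 / 1920, 2160 / 1080)  # -> 2.0 2.0
```

DCI 4K works out to roughly 1.90:1, while UHD is the familiar 1.78:1 (16:9), so the two "4Ks" are not even the same shape.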
-
2160p/i (what people call UHD), 1080p/i, and 720p/i are all 16:9 resolutions; these specs are made by the NAB for broadcast (p is progressive, i is interlaced). The term "UHD" is not a spec; it is a term made by CES for marketing. Really, all it means is higher resolution than 1080 HD.
Broadcast standards have to be made (NAB) and then approved (FCC). Currently this is an MPEG2 stream (slightly better than DVD quality). NAB wants this to change by late 2016 to an H.265 (HEVC) codec for a 2160p (UHD) stream. This would mean that your 1080p/720p TV would need a converter box to transcode the H.265 stream into MPEG2, which the TV knows how to decode.
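To make that concrete, such a converter box is essentially a transcoder. A rough software sketch of the same operation, assuming ffmpeg is installed (the filenames and bitrate are hypothetical):

```python
import subprocess

# Hypothetical example: turn an incoming H.265/HEVC 2160p stream into an
# MPEG-2 1080-line stream that an older TV knows how to decode.
subprocess.run([
    "ffmpeg",
    "-i", "incoming_hevc_2160p.ts",  # H.265 UHD input (hypothetical name)
    "-vf", "scale=1920:1080",        # downscale 2160 lines to 1080
    "-c:v", "mpeg2video",            # re-encode as MPEG-2
    "-b:v", "15M",                   # broadcast-like bitrate (assumed)
    "converted_mpeg2_1080.ts",
], check=True)
```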
Film standards are made by DCI; they aren't approved by anyone (there's no transmission), they just become accepted standards. These will likely never be the same as the broadcast standards, for a number of reasons, one of them being incompatible aspect ratios. It's easy to put 16:9 or 4:3 on a 16:9 TV, or to modify a CinemaScope aspect (there are a few different ones) to fit on a 16:9 display. Fitting a 16:9 image on a film aspect would give you much more wasted space, as films are shot with a wider aspect than TV.
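The fitting itself is just letterbox/pillarbox arithmetic. A small sketch, using 2.39:1 as the CinemaScope example (as noted above, there are a few variants):

```python
def wasted_fraction(content_aspect, screen_aspect):
    """Fraction of the screen left black when content is scaled to fit."""
    if content_aspect > screen_aspect:
        used = screen_aspect / content_aspect  # letterbox: bars top/bottom
    else:
        used = content_aspect / screen_aspect  # pillarbox: bars at sides
    return 1 - used

tv = 16 / 9      # ~1.78:1 television display
scope = 2.39     # one common CinemaScope ratio

print(f"Scope film on a 16:9 TV: {wasted_fraction(scope, tv):.0%} unused")
print(f"16:9 video on a scope screen: {wasted_fraction(tv, scope):.0%} unused")
print(f"4:3 video on a 16:9 TV: {wasted_fraction(4 / 3, tv):.0%} unused")
```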
-
It looks like UHD is standardized only in Japan, but it is still an English-language description that can't be a standard in general terms.
-
@Dashrender said:
I thought normal TV programs were still broadcast in 720p due to bandwidth.
It depends on the broadcaster; they can easily do 1080p and many here are, but the secondary channels tend to be 720p. The hardware encoders required are much more costly at 1080 than at 720, just as progressive is more expensive than interlaced.
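One way to see the cost difference is raw pixel throughput. A back-of-the-envelope sketch, assuming the common US broadcast frame/field rates:

```python
def megapixels_per_second(width, height, rate, interlaced=False):
    # Interlaced formats transmit only half the lines per field.
    per_image = width * height / (2 if interlaced else 1)
    return per_image * rate / 1e6

for name, w, h, rate, interlaced in [
    ("720p60", 1280, 720, 60, False),
    ("1080i60", 1920, 1080, 60, True),   # 60 fields per second
    ("1080p60", 1920, 1080, 60, False),
]:
    print(f"{name}: {megapixels_per_second(w, h, rate, interlaced):.1f} Mpx/s")
```

On those assumptions, 1080p60 pushes exactly twice the pixels per second of 1080i60, which is one reason progressive encoders cost more.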
-
@scottalanmiller said:
Nor does marketing claiming that something is a standard make it so. HD simply predates the standard. We had HD when I was little and no one had dreamed up 480p yet.
Just because people said high-definition, those are just words people use to describe it. High-Definition is now a standard, and since the resolution was developed, it has been called such... And regardless of WHEN the terms were coined, they're still the industry standard, and the lines are very clear as to what term applies to what resolution. The only time that this isn't clear is when people haven't been educated about the proper terms, which is perfectly fine, but the industry still uses accepted standards. Everything in the electronics industry is carefully categorized, and display resolutions are the same way. And, as always, marketing is usually not correct, and popular belief tends to be based on what someone saw on Facebook or a TV advertisement.
-
@scottalanmiller said:
UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.
UHD is the standard for 3840 x 2160, globally. Marketing teams call it 4K, and then suddenly the lines are blurred. It isn't 4K, it's UHD.
-
@Mike-Ralston said:
Just because people said high-definition, those are just words people use to describe it. High-Definition is now a standard
High Definition is not a standard. All it means is resolutions higher than standard definition. It's just an umbrella marketing term that 4K and UHD could both fall under.
-
@Mike-Ralston said:
Just because people said high-definition, those are just words people use to describe it. High-Definition is now a standard, and since the resolution was developed, it has been called such...
Maybe it is a standard to YOU, but it is not what people accept as a "standard." It's just a loose term that some people claim means one thing and others claim means another. It's an English description and not eligible to be a true standard, ever. This is just how language works. That you feel it is a standard, or that someone hopes it will become one, or that people feel it has been back-ported to one, simply doesn't make it so. It is not a standard, and claiming it is doesn't change that. The term predates the standard; you can't reverse these things.
Cloud Computing is different. The term was defined before first use and has an owner. It's a specific term that did not exist prior to its current use, and that's why there is an official (in the US) standard for it.
HD is just wishful thinking. We had HD before there was any thought of there being a standard, and calling things HD now that they are obviously not is no longer correct by any stretch. 720p, for example, cannot even remotely be considered higher than normal definition.
-
@Mike-Ralston said:
@scottalanmiller said:
UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.
UHD is the standard for 3840 x 2160, globally. Marketing teams call it 4K, and then suddenly the lines are blurred. It isn't 4K, it's UHD.
Except 4K is just a marketing term too. I think you are misunderstanding the concept of a standard. There is very obviously no standard for any of these terms. Different groups (consumers, cinema, manufacturers, marketers, television broadcasters, countries) use them differently and it is accepted.
If there were actually a standard, mislabeling would be illegal marketing and you'd be able to sue over labels being wrong. You cannot. Why? Because they are not standards.
You can say 4K is one thing, but most of the market says it is something different. You can't go to court about that because 4K is just a marketing term, not a standard.
-
@Mike-Ralston said:
And regardless of WHEN the terms were coined, they're still the industry standard, and the lines are very clear as to what term applies to what resolution.
Your own descriptions of usage have shown that these things are not accepted standards. Go to a store, and HD, UHD, and 4K don't mean what we accept them to mean. So there is no standard.
HD is not an industry standard. And "industry" doesn't exist here as there are at least six industries involved, none of whom agree on terms!
-
@Mike-Ralston said:
The only time that this isn't clear is when people haven't been educated about the proper terms, which is perfectly fine, but the industry still uses accepted standards. Everything in the electronics industry is carefully categorized, and display resolutions are the same way. And, as always, marketing is usually not correct, and popular belief tends to be based on what someone saw on Facebook or a TV advertisement.
This is not true. In fact, if you know what 4K is "supposed to mean" (by whom, exactly, as there is no standard), you are misled, as the television industry doesn't support the "standard" that you claim exists.
In fact, the television industry would say that it is you and your "standard" that are wrong as they clearly have a standard with manufacturers that does not agree with you.
-
@scottalanmiller All standards for display resolutions were set out by NTSC and VESA.
-
@Mike-Ralston said:
@scottalanmiller All standards for display resolutions were set out by NTSC and VESA.
Um, no. NTSC was for only one country and didn't set international standards. VESA is just a company that sells standards and hasn't been a major player in decades. Theirs were anything but standards for "all" displays, and neither has had any important role in a very, very long time.
-
@Mike-Ralston said:
https://en.wikipedia.org/wiki/VESA
https://en.wikipedia.org/wiki/NTSC
NTSC has been obsolete for many years (it became illegal in 2009); ATSC is the current standard. However, they specify the standards of broadcast "legal" video.
-
@thecreativeone91 said:
@Mike-Ralston said:
https://en.wikipedia.org/wiki/VESA
https://en.wikipedia.org/wiki/NTSC
NTSC has been obsolete for many years (it became illegal in 2009); ATSC is the current standard. However, they specify the standards of broadcast "legal" video.
Which, while important, is a very minor slice of American-only specs. Broadcast is almost trivial these days (it does not apply to cable, computers, Netflix, DVDs, Blu-ray, etc.) and is US-only (the US is a big player, but only one of many, and not as big as the EU or Chinese standards).
-
@scottalanmiller said:
VESA is just a company that sells standards and hasn't been a major player in decades. Theirs were anything but standards for "all" displays, and neither has had any important role in a very, very long time.
VESA is the group responsible for standardized monitor mounting and for the highest-bandwidth consumer display connector currently available, DisplayPort, which is the only standard currently able to support Adaptive Synchronization. VESA is made up of a large number of corporations who get to decide the de facto standards that everyone else follows. Maybe they aren't OFFICIAL standards, but they are the industry standards that everyone involved in the display panel industry knows, and they are widely accepted. So I was incorrect to say they are Official Standards, as they weren't set forth by the FCC or some governing body. I apologize for the misinformation.