4K vs UHD
-
@scottalanmiller said:
UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.
UHD is the standard for 3840 x 2160, globally. Marketing teams call it 4K, and then suddenly the lines are blurred. It isn't 4K, it's UHD.
-
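Since the distinction between the two resolutions keeps coming up, here is a quick arithmetic sketch of how consumer UHD compares to DCI 4K, the cinema spec the "4K" name originally comes from (plain Python, nothing assumed beyond the two resolutions named in the thread):

```python
# Compare consumer UHD with DCI 4K (the cinema container the "4K" name comes from).
uhd = (3840, 2160)
dci_4k = (4096, 2160)

def describe(name, res):
    w, h = res
    return f"{name}: {w}x{h}, {w * h / 1e6:.2f} MP, aspect {w / h:.3f}:1"

print(describe("UHD", uhd))        # UHD: 3840x2160, 8.29 MP, aspect 1.778:1
print(describe("DCI 4K", dci_4k))  # DCI 4K: 4096x2160, 8.85 MP, aspect 1.896:1

# UHD is 6.25% narrower than true DCI 4K, at the same height.
print(f"width difference: {(dci_4k[0] - uhd[0]) / dci_4k[0]:.2%}")
```

So the two labels really do describe different pixel grids, which is exactly why conflating them in marketing blurs the lines.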
@Mike-Ralston said:
Just because people said high-definition, those are just words people use to describe it. High-Definition is now a standard
High Definition is not a standard. All it means is a resolution higher than standard definition. It's just an umbrella marketing term that both 4K and UHD could fall under.
-
@Mike-Ralston said:
Just because people said high-definition, those are just words people use to describe it. High-Definition is now a standard, and since the resolution was developed, it has been called as such...
Maybe it is a standard to YOU but it is not what people accept as a "standard." It's just a loose term that some people claim means one thing and others claim means another. It's an English description and not eligible to be a true standard, ever. This is just how language works. That you feel it is a standard or someone hopes that it will become one or people feel like it has been back-ported to one simply doesn't make it so. It is not a standard and claiming it is doesn't change that. The term predates the standard, you can't reverse these things.
Cloud Computing is different. The term was defined before first use and has an owner. It's a specific term that did not exist prior to use as it is used today and that's why there is an official (in the US) standard for it.
HD is just wishful thinking. We had HD before there was any thought of there being a standard and calling things HD now that they are obviously not is really no longer correct by any stretch. 720p, for example, cannot even remotely be considered higher than normal definition.
-
@Mike-Ralston said:
@scottalanmiller said:
UHD is standardized only in Japan, it looks like, but it is still an English language description that can't be a standard, in general terms.
UHD is the standard for 3840 x 2160, globally. Marketing teams call it 4K, and then suddenly the lines are blurred. It isn't 4K, it's UHD.
Except 4K is just a marketing term too. I think you are misunderstanding the concept of a standard. There is very obviously no standard for any of these terms. Different groups (consumers, cinema, manufacturers, marketers, television broadcasters, countries) use them differently and it is accepted.
If there were actually a standard, mislabeled products would violate marketing law and you'd be able to sue over wrong labels. You cannot, and why? Because they are not standards.
You can say 4K is one thing, but most of the market says it is something different. You can't go to court about that because 4K is just a marketing term, not a standard.
-
@Mike-Ralston said:
And regardless of WHEN the terms were coined, they're still the industry standard, and the lines are very clear as to what term applies to what resolution.
Your own descriptions of usage show that these things are not accepted standards. Go to a store and HD, UHD, and 4K don't mean what we accept them to mean. So there is no standard.
HD is not an industry standard. And a single "industry" doesn't even exist here, as there are at least six industries involved, none of which agree on terms!
-
@Mike-Ralston said:
The only time that this isn't clear, is when people haven't been educated about the proper terms, which is perfectly fine, but the industry still uses accepted standards. Everything in the electronics industry is carefully categorized, display resolutions are the same way. And, as always, marketing is usually not correct, and popular belief tends to be based on what someone saw on Facebook, or a TV advertisement.
This is not true. In fact, if you know what 4K is "supposed to mean" (by whom, exactly, as there is no standard), you are misled, as the television industry doesn't support the "standard" that you claim exists.
In fact, the television industry would say that it is you and your "standard" that are wrong as they clearly have a standard with manufacturers that does not agree with you.
-
@scottalanmiller All standards for display resolutions were set out by NTSC and VESA.
-
@Mike-Ralston said:
@scottalanmiller All standards for display resolutions were set out by NTSC and VESA.
Um, no. NTSC was for only one country and didn't set international standards. VESA is just a company that sells standards and hasn't been a major player in decades. They were anything but standards for "all" displays and neither has had any important role in a very, very long time.
-
@Mike-Ralston said:
https://en.wikipedia.org/wiki/VESA
https://en.wikipedia.org/wiki/NTSC
NTSC has been obsolete for many years (it became illegal in 2009); ATSC is the current standard. However, they only specify the standards for "broadcast legal" video.
-
@thecreativeone91 said:
@Mike-Ralston said:
https://en.wikipedia.org/wiki/VESA
https://en.wikipedia.org/wiki/NTSC
NTSC has been obsolete for many years (it became illegal in 2009); ATSC is the current standard. However, they only specify the standards for "broadcast legal" video.
Which, while important, is a very minor, American-only slice of the specs. Broadcast is almost trivial these days (it does not apply to cable, computers, Netflix, DVDs, Blu-ray, etc.) and is US-only (the US is a big player, but only one of many, and not as big as EU or Chinese standards).
-
@scottalanmiller said:
VESA is just a company that sells standards and hasn't been a major player in decades. They were anything but standards for "all" displays and neither has had any important role in a very, very long time.
VESA is the group responsible for standardized monitor mounting, and the highest-bandwidth consumer display connector currently available, DisplayPort, which is the only standard currently able to support Adaptive Synchronization. VESA is made up of a large number of corporations who get to decide what the de facto standards are, that everyone else follows. Maybe they aren't OFFICIAL standards, but they are the industry standards that everyone involved in the display panel industry knows, and they are widely accepted. So, I was incorrect in saying they are Official Standards, as they weren't set forth by the FCC or some governing body; I apologize for the misinformation.
-
@Mike-Ralston said:
@scottalanmiller said:
VESA is just a company that sells standards and hasn't been a major player in decades. They were anything but standards for "all" displays and neither has had any important role in a very, very long time.
VESA is the group responsible for standardized monitor mounting, and the highest-bandwidth consumer display connector currently available, DisplayPort, which is the only standard currently able to support Adaptive Synchronization. VESA is made up of a large number of corporations who get to decide what the de facto standards are, that everyone else follows. Maybe they aren't OFFICIAL standards, but they are the industry standards that everyone involved in the display panel industry knows, and they are widely accepted. So, I was incorrect in saying they are Official Standards, as they weren't set forth by the FCC or some governing body; I apologize for the misinformation.
What do any of those standards have to do with HD, 4K, film vs broadcast, etc., though?
-
@thecreativeone91 HD is commonly accepted as 720p, 4K is properly known as 4096 x 2160, and so forth. Broadcast isn't standardized, as every network may choose to broadcast at a different resolution or aspect ratio. Film is most commonly done at 1.85:1 or 2.35:1, and can be viewed in its proper glory on a 21:9 aspect ratio monitor, a new generalized "standard" that has been out for a few years.
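As a rough arithmetic check on those ratios, here is a sketch of how the two common film formats fit a "21:9" panel. One assumption: I use 2560x1080 as the representative ultrawide resolution, which is really about 2.37:1 since "21:9" is itself a loose marketing label:

```python
# How common film aspect ratios fit a "21:9" ultrawide panel.
# Assumption: 2560x1080 (~2.37:1) as the representative panel resolution.
screen_w, screen_h = 2560, 1080

def image_size(aspect):
    """Largest image of the given aspect ratio that fits the screen."""
    w = min(screen_w, round(screen_h * aspect))
    h = min(screen_h, round(screen_w / aspect))
    return w, h

for name, aspect in [("1.85:1 flat", 1.85), ("2.35:1 scope", 2.35)]:
    w, h = image_size(aspect)
    used = w * h / (screen_w * screen_h)
    print(f"{name}: {w}x{h} ({used:.0%} of the panel)")
# 1.85:1 flat: 1998x1080 (78% of the panel)
# 2.35:1 scope: 2538x1080 (99% of the panel)
```

The numbers show why scope (2.35:1) films are the ones that look at home on an ultrawide: they fill roughly 99% of the panel, while 1.85:1 material leaves visible pillarbox bars.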
-
@Mike-Ralston said:
@thecreativeone91 Broadcast isn't standardized, as every network may choose to broadcast at a different resolution or aspect ratio.
That's not true for broadcast.
-
@Mike-Ralston said:
@thecreativeone91 HD is commonly accepted as 720p
What? 720p is the entry-level HD resolution that was a way to ease into higher resolutions. Full HD is 1920x1080.
Also, the P is irrelevant to resolution. It refers to progressive video. Many broadcast stations still do interlaced, which means two "frames" are put into one by using the upper and lower fields of the video. HD, SD, etc. can be either progressive or interlaced.
-
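The interlacing described above (two fields woven into one full frame) can be sketched in a few lines. This is purely illustrative, with frames modeled as lists of rows rather than real video data:

```python
# Interlaced video splits each frame into two fields: the upper field holds
# rows 0, 2, 4, ... and the lower field holds rows 1, 3, 5, ..., captured at
# different instants. A deinterlacer "weaves" the fields back into one frame.

def split_fields(frame):
    """Return (upper, lower) fields from a full-height frame."""
    return frame[0::2], frame[1::2]

def weave(upper, lower):
    """Interleave two fields back into a single full-height frame."""
    frame = []
    for u, l in zip(upper, lower):
        frame.extend([u, l])
    return frame

frame = [f"row{i}" for i in range(6)]
upper, lower = split_fields(frame)
assert weave(upper, lower) == frame  # lossless only if nothing moved between fields
print(upper)  # ['row0', 'row2', 'row4']
print(lower)  # ['row1', 'row3', 'row5']
```

This also shows why the i/p suffix is orthogonal to resolution: 1080i and 1080p carry the same pixel grid, they just deliver it as two half-height fields versus one whole frame.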
@Mike-Ralston said:
@thecreativeone91 HD is commonly accepted as 720p
"Commonly" is a tough term here. Commonly by consumers being sold cheap displays? Commonly people know that marketing people will use this term to fool them? Sure, that might be common, or maybe not. Normal people don't understand any of these terms. The number of people being sold them is many times higher than the number of people with some understanding of them and the number of people who really know what is intended or being said is a small subset of that.
Ask an average person what HD means, and they will probably have no idea what 1080p is, but they will likely state that it means "high definition," which is the opposite of what 720p is today.
-
@thecreativeone91 said:
@Mike-Ralston said:
@thecreativeone91 HD is commonly accepted as 720p
What? 720 is the entry HD thing that was a way to ease into higher resolutions. Full HD is the 1920x1080.
Also, the P is irrelevant to resolution. It refers to progressive video. Many broadcast stations still do interlaced, which means two "frames" are put into one by using the upper and lower fields of the video. HD, SD, etc. can be either progressive or interlaced.
My Dreamcast was 480p SD. No widescreen, but it looked good.
-
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
-
@Mike-Ralston said:
@thecreativeone91 Maybe broadcast is more locked down in the US, but with Digital Broadcasting, a network can stream at the resolution and aspect ratio that it wants. The TV receiving it will downsample or stretch the image, but that's up to the hardware on the user end. And HD does not refer to Interlaced Scan video, only Progressive Scan. Interlaced is used for TV broadcasting, and the occasional piece of professional equipment, most everything else is progressive scan.
Broadcast is VERY stringent in the US. What is allowed to be broadcast is crazy specific because it uses publicly shared airwaves. Broadcasters get a few choices, yes, but they are all pre-determined and very, very specific.