The question is, does anyone care? Obviously, some do care passionately -- video engineers, product designers, filmmakers and serious home theater fans to name a few.
But by and large, most video being watched by the vast majority of people on the vast majority of days varies wildly in image quality, from a YouTube clip on your phone to a rerun on your PC to a Blu-ray on your big screen. The original (and transcoded) quality of the video source is obviously a big factor -- nobody expects camcorder footage to look like Hollywood.
Whether you're in the home theater sphere, where you really can get state-of-the-art technical reproduction (if you're able to pay for it), or watching on a PC, most display screens don't come close to displaying video images the way "science" says they should be seen. Reactions from the vast majority of the world generally range from cheerfully ignorant to "who cares?" According to the Cable & Telecommunications Association for Marketing (CTAM), 94% of HDTV viewers are satisfied with their picture quality.
According to science, the brightness and contrast you're watching are likely way out of adjustment. If it's a new TV, count on it having been pre-set at the factory to be as bright and contrasty as possible so that the picture pops more. If you're watching on a PC, you likely have a selection of gamma values for the display screen -- who knows what it's been adjusted to and whether it's proper? Do you?
Even the distance you're watching from is likely wrong, so the problem there is less technical than good old-fashioned human error -- at least as far as science is concerned. Did you know that THX -- the guys who really care about picture quality -- recommends you sit only about 6 feet from your 50" HDTV? In other words, about two arm lengths. Practical? No. Technically recommended? Yes.
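For the curious, that figure isn't arbitrary: THX's guidance is based on how much of your field of view the screen should fill -- commonly cited as roughly a 40-degree horizontal viewing angle, though the exact angle varies by source. A back-of-the-envelope sketch of the arithmetic, assuming a 16:9 screen:

```latex
% Width of a 50-inch-diagonal 16:9 screen:
w = 50 \times \frac{16}{\sqrt{16^2 + 9^2}} \approx 43.6 \text{ in}

% Distance at which that width spans about 40 degrees:
d = \frac{w/2}{\tan 20^\circ} \approx \frac{21.8}{0.364} \approx 60 \text{ in} \approx 5 \text{ ft}
```

That works out to roughly five to six feet -- uncomfortably close by most people's living-room standards, which is exactly the point.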
I recently made a TV recommendation to a friend who was buying his first HDTV for him and his son to enjoy in their man cave. They ended up with a shockingly good deal on a 50" big-name set, as is increasingly the case these days. I offered to come calibrate the set, which sits about 15 feet away from their couch.
Though I'm someone who's attended enough of these technical seminars to know the TV was farther away than optimally recommended, who was I to say so? They were watching a big, colorful picture that they were really enjoying. Even though the brightness was (as expected) set at the factory to burn like noontime, with contrast set way too high to try to compensate; even though the color temperature was set to "cool" as opposed to the technically "correct" warmer temperature; even though there was an unfortunate default picture "mode" that made everything look even worse -- they were loving it. They loved it even more after a quick calibration using only a test disc.
My point here is not to say that the standards shouldn't exist -- of course they should. But in the real world, they're just not terribly applicable, and part of a larger trend where pure technical "quality" is no longer the driving idea behind what people care about in their home entertainment.
Just around the corner are newer, better display technologies that leave today's HDTVs in the dust. There's going to be a push for "4K" video -- also called "Ultra HD" -- to come to market, with four times the pixels of today's best 1080p sets. There's going to be a push for OLED TVs, which promise better picture quality than today's best LED, LCD and plasma sets. And let's not forget 3D that works -- I mean works -- without glasses -- that's coming too, and fast.
But will any but the rarefied few -- the ones schooled in the science -- care? Is "better" picture quality still a goal worth pursuing with all those billions of dollars/yen/won in research and development, and all those thousands of dollars that the new sets will cost?
For some, the answer will be yes. For many, the answer will likely be a yawn.