Monday, December 05, 2016

Dawn of Ultra-High Definition: PSA on DAWN OF THE DEAD 4K, UHD-TVs, and HDR




WHAT THE HELL DID NICOLAS WINDING REFN
DO TO THE POSTER FOR DAWN OF THE DEAD?!

No clue, friends. But that aside, we've got some good news!

For those who may not be aware, Italian label Nightmare Factory is releasing several editions of their all new 4K restoration of DAWN OF THE DEAD. Yes, this release is reported to be Region B locked, and yes, rumor has it that the "restored" transfers will all have forced Italian subtitles - but I'll just have to find out when my copy arrives, won't I?

What you need to know is this: The 4K restoration is the first new transfer minted for Romero's 1978 classic since the Rubinstein Company started their 3D conversion back in 2008, which has so far only been shown in a handful of theatrical screenings. For better or worse, the new Italian-handled restoration is based on the 118 minute European cut - "Dario Argento Presents Zombie", if you know the film's convoluted and multinational history - but with Rubinstein having funneled a fortune into a 3D version literally nobody asked for, and reportedly holding the home video rights hostage until he's able to make his money back on a fat license fee, this is probably the only viable alternative we're going to see. If you're a Dawn of the Dead fan and you're craving a state-of-the-art 4K release... this is pretty much the only game in town for the foreseeable future.

For those who aren't exactly fluent in Italiano - and that includes myself - here's what the 4-disc edition contains in a nutshell:
  - Restored HD version (2016 4K master) of the 118 minute European cut of the film
  - HD version (2013 master) of the 127 minute American Theatrical cut of the film
  - HD version (2013 master) of the 133 minute Extended Workprint
  - Bonus BD with 2.5 hours of new content plus vintage trailers
  - 5 postcards designed by fans of the film
  - Booklet

I won't get too in-depth on the bonus features here since... well, it's a safe bet that 90% of them will be in Italian with no English subtitles. The 18 minute interview with Tom Savini will likely be in English, as may the 8 minute introduction by Nicolas Winding Refn (who seems to have gotten the ball rolling on this 4K remaster project), but I'll be shocked if anything featuring Dario Argento, Claudio Simonetti and so on features any English dialog or subtitles of note. That said, if you already own any of the exhaustive, absurdly special-features-packed DVD or Blu-ray releases from the last 12 years and you still need more, I... don't know what you're expecting to find... I'unno, maybe that 8 minute restoration featurette will show some cool Before/After footage?

The 6-disc edition contains all of the above, but adds two very enticing extra discs into the mix:
  • - "4K" UHD-BD of the 118 minute European cut of the film
  • - HD "unrestored" 1.37:1 open-matte version of the European cut

You should also know that there's reportedly a fairly major issue with the 4K Restored Blu-ray which I can only describe as "I-frame pulsing". Basically, every I-frame (once a second or so) is much sharper, and noisier, than the B-frames and P-frames that follow, which have something of a softer, more diffuse look. It's not even that the softness is the problem - it's the inconsistency of a scene suddenly "pulsing" a sharper frame, only to be followed by a string of more neutral, blurred frames. If you're an autistic crazy person who knows what GOP structure is, you'll instantly know (and probably hate) what you're gonna' see in this set... but, it's impossible to know how bad these things are via stills, so I'll just have to report back when I see it myself in motion.
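
If you'd rather check for that kind of pulsing yourself than trust screenshots, here's a rough little Python sketch of how I'd go about it - it leans on ffprobe for the frame types and OpenCV for a crude sharpness score, the file path is obviously a stand-in, and the frame alignment between the two tools is only approximate, so treat it as a diagnostic toy rather than gospel:

```python
import subprocess
import cv2  # pip install opencv-python; ffprobe needs to be on your PATH

VIDEO = "dawn_of_the_dead_remux.mkv"  # stand-in path, point it at your own rip

# Pull the coded picture type (I/P/B) of every video frame out of the stream.
probe = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "frame=pict_type", "-of", "csv=p=0", VIDEO],
    capture_output=True, text=True, check=True)
pict_types = [line.strip().strip(",") for line in probe.stdout.splitlines() if line.strip()]

# Score each decoded frame's sharpness with the variance of its Laplacian -
# sharper/noisier frames score higher, softer diffuse frames score lower.
scores = {"I": [], "P": [], "B": []}
cap = cv2.VideoCapture(VIDEO)
for pict in pict_types:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    scores.setdefault(pict, []).append(cv2.Laplacian(gray, cv2.CV_64F).var())
cap.release()

# If the I-frame average towers over the P/B averages, that's your pulse.
for pict, vals in scores.items():
    if vals:
        print(f"{pict}-frames: {len(vals)} frames, mean sharpness {sum(vals) / len(vals):.1f}")
```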

For those so inclined to import - and who can deal with the combination of region lock frustrations and possible forced subtitle shenanigans that always crop up on titles like this - the 4-Disc HD Edition can be had for about $36, and when I pulled the trigger on the 6-Disc 4K Edition it was for just over $50 - and that's with standard shipping included! I'm not getting a dime should you follow those links, I just want the world to know that this exists, and that while there's already talk of some compression annoyances, it's still going to blow the similarly themed Japanese box set out of the water - and that damn thing still sells for about $120!

There's also a 2-disc DVD for those who just straight up hate quality. I have little doubt a non-limited edition of the European remaster will be released next year for a lower price - likely without the American and Extended cuts, of course - but for the sheer volume of content you're actually getting in this particular package, it'd be a little crazy to hold out for something better.

SO DOES THIS MEAN KENTAI'S
DOING HIS FIRST 4K REVIEW?!

Sadly, no. But having weighed this one out for weeks, I'd rather take the time to explain why I won't be doing it - and why nobody but those with way too much money to burn should bother, either.

This is primarily because I haven't personally made the jump to "4K"/UHD, and while I've been horrendously tempted to pick up an Xbox One S just to review the 4K disc at 1080p... that would be rather pointless on a lot of levels. As such, I'll be doing a write-up on the BDs, which - considering how many things are wrong with all the HD versions of this film - should still be plenty fascinating to dig through on its own, anyway.

This is a good time to rant about the current state of UHD, though: To be blunt, the display hardware - while absolutely mouth-watering on its own - simply isn't ready for anyone who actually understands what these displays are (and aren't!) capable of doing. I've been tempted to spend a small fortune on an overpriced OLED and lord it over all of my friends as they huddle around their sad, pathetic, peasant LEDs for warmth... but try as I might, I just can't convince myself it's worth it. Not yet, anyway.

Consider the following a primer of sorts for anyone who's about to take the plunge on a new display. My advice is "wait" - but if you're still dead set on being that guy, I completely understand. At least know what the hell you're getting into.

WHAT'S UP WITH THOSE FANCY
4K ULTRA BLU-RAYS, ANYWAY?

Thankfully, "4K Resolution" itself is pretty straight forward* - double the pixels in width, and height, meaning 3840:2160 UHD has exactly four times the resolution of 1920:1080 HD. The standard also allows for up to 59.94fps, meaning that James Cameron's fantasy of people wanting high refresh rates outside of video games and pornography could be a thing... if, y'know, anyone wants to actually produce content that isn't garbage dramas shot interlaced. (Or Hobbits. I guess.)

* Other than the fact that "4K" is referencing horizontal resolution while "1080p" was referencing vertical, which literally makes fuck-all sense. Also, the 4K cinema spec is 4096 wide, the same way that 2K is 2048 wide, meaning that not only is "4K UHD" not by definition 4K resolution, but that "2K" should really be "2K HD" - unfortunately people assume that 2K is inherently better than HD as a result, when pointing out "2K SCAN" vs "HD TELECINE" has more to do with how the image was captured and restored than the actual resolution thereof. Because fuck TV marketing.

UHD-BD has some interesting quirks as a format; not only are discs available in single-layer 50 GB, dual-layer 66 GB and triple-decker 100 GB flavors, but each of those discs has its own maximum bandwidth limit - 82 Mb/s, 108 Mb/s, and 128 Mb/s respectively! Audio has largely been Dolby Atmos encoded, which is basically a conventional channel mix with metadata to "shift" individual sounds around a grid of tiny satellite speakers; it's actually pretty cool tech, but completely impractical outside of an actual movie theater with a massive array of speakers installed to cover a wide area. Video - the part that interests me the most, I admit - is now handled by 10-bit HEVC, which - having done some preliminary test encodes myself - I can confirm holds an almost shocking level of efficiency improvement over Blu-ray's most common codec, AVC, and I suspect that in most cases, 100 GB is more than enough for an excellent, reference-quality end-user transfer.
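
For a sense of what those caps actually mean in practice, here's some back-of-the-napkin math - peak rates, decimal gigabytes, no audio or overhead accounted for, so real discs obviously average well below this:

```python
# How long each UHD-BD tier could run if you pinned it at its peak video rate.
tiers = {
    "50 GB @ 82 Mb/s": (50, 82),
    "66 GB @ 108 Mb/s": (66, 108),
    "100 GB @ 128 Mb/s": (100, 128),
}
for name, (capacity_gb, peak_mbps) in tiers.items():
    total_megabits = capacity_gb * 8000      # 1 GB = 8,000 megabits (decimal)
    minutes = total_megabits / peak_mbps / 60
    print(f"{name}: ~{minutes:.0f} minutes at full tilt")
```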

Yes, of course, word is that there are already a handful of UHD-BDs with visible compression issues, and I'm sure that bitrate-starved Netflix and Hulu streams will always be disappointing when it comes to grain structure - but if you didn't expect that, you didn't pay much attention to the launch of DVD, Blu-ray, or Netflix HD, did you? There are still some minor disappointments in the spec sheets - for one, video is still subsampled to 4:2:0, and for another, there's still "Limited" and "Full" color spaces to confuse and annoy everyone who can instantly tell the difference between PC and TV levels - but honestly, chroma bumped up to a full 1920:1080 is probably enough to satisfy even my crazy self.
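
And in case that "full 1920:1080 chroma" line sounds like hand-waving, the arithmetic behind it is dead simple - 4:2:0 halves the chroma planes in both directions, so UHD chroma lands exactly where Blu-ray luma used to be:

```python
# 4:2:0 chroma is half the luma resolution in each dimension.
for label, (w, h) in {"UHD (3840x2160)": (3840, 2160), "Blu-ray (1920x1080)": (1920, 1080)}.items():
    print(f"{label} -> chroma planes of {w // 2}x{h // 2}")
# UHD's chroma planes (1920x1080) match the full luma resolution of a Blu-ray;
# Blu-ray itself only gets 960x540 worth of color detail.
```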

But by far, the most promising part of this whole process is the introduction of HDR - or High Dynamic Range. In super-simplistic terms, it means two things: expanded color space, and higher peak luminance (whites). The actual colorspace for Rec. 2020 expands red notably to include those glowing, fire-engine-light reds that the Rec. 709 HDTV colorspace simply was never designed to acknowledge, to say nothing of the other-worldly greens that... ironically, nobody's ever seen before in a movie. Seriously, the technology to produce them on a digital screen didn't even exist until recently, so unless you watched, live and in person, the most amazingly fabulous bright-ass neon green pride float to ever burst into flames in NoHo, you've probably never seen colors quite like what 2020 is capable of producing once you hit those outer reaches of the gamut.


Thanks for the visual aid, Google Image Search!

It's worth noting that for these colors to even exist as a spectrum of light the human eye can see, there has to be a lot of light being pushed out by the display - considerably more light than we've ever had on consumer or even public exhibition screens until quite recently.

So... where's the problem? Kentai's down for 10-bit color, and this Deep Color stuff is pretty rad, right? Mo' Reds, Mo' Greens, and screens so bright you'll go blind - what's not to love?! Well... there's a couple of things really chafing my HDR boner, and if you've looked into it as long as I have, you'll feel thoroughly cuckolded by the mistress of Rec. 2020 yourself.

PROBLEM NUMBER ONE:
NOBODY EVEN USES THOSE DAMN COLORS!!

To put this another way: have you ever looked at a movie and gone, man, this scene is visually stunning - but if only there were deeper, more vivid greens? Well... I mean, maybe you have. But to Rec. 709's credit, it basically covers the overwhelming majority of the blue spectrum humans are capable of seeing, and while red can certainly be improved, the only major gains we get are in the side of the color spectrum that we associate with... golf courses. Dramatic lighting tends to be white, blue or red, and while higher color fidelity will lead to greater visual contrast in some titles and subtle improvements on things like color banding for pretty much everything, this is very much a subtle refinement in 95% of real-world uses.

Arguably, though, Rec. 2020 is a bit less important than DCI-P3, which was a color space specifically made to capture every possible color range of 35mm film. Well, more relevant as far as movies are concerned, anyway - I'm thinking of the obnoxiously saturated colors in games like the DOOM reboot and salivating at what might be with a new HDR profile...

Many titles in the initial batch of UHD-BDs - The Martian, Mad Max: Fury Road, and so on - were all, seemingly at least, re-graded from scratch specifically to show off how vibrant and pretty and magical this new format was. Stuff released by Sony... wasn't. Pineapple Express was hardly the sort of title I expected from the first wave, but a decent one to show that even with all the expanded color in the world, a movie shot with a drab, overcast visual style is always going to be a drab, overcast movie. I imagine the world will be surprised to see that the inevitable UHD-BD release of Caddyshack doesn't glow like the primary-heavy hues of an X-rated Ralph Bakshi cartoon, but these are things that people will slowly figure out on their own.

They were all HDR-enhanced transfers, make no mistake, and there's a difference in just how vibrant those titles can get in terms of red and green and particularly highlights on the bright end of the grayscale, but the difference is subtle refinements and increased fidelity in things like traffic lights and grass lawns - not show-stopping crazy neon explosions. I have little doubt that young and enterprising filmmakers will make full use of these new expanded colors over time, but for now, this is basically just additional chroma headroom for your favorite movies; nice to have, sure, but the odds of you noticing a huge difference, even during an A/B comparison, aren't that bloody high.

PROBLEMO NUMERO DUE:
THE DISPLAYS JUST... AREN'T THERE YET

Even if we assume that P3 is more important than 2020, there isn't yet a consumer-level device capable of displaying that entire range of color anyway. Even when properly calibrated and set up, there will be a point of roll-off where the brightest green simply stops short of the full range of the signal. The main reason for this is that - assuming the hardware all down the chain is capable of accepting the full signal to begin with - to generate a wider color gamut, you need to produce a brighter peak white to carry the wavelength the color exists in.

TV manufacturers love to talk about wide color and High Dynamic Range and all the amazing stuff their new models can do that the competition can't - yet they're always oddly hesitant to talk about the actual cd/m2 rating - or number of "Nits" produced - because they know even the high-end OLED and LCD screens out there fall damn short of the recommendations. Simply put, if you're watching an SDR screen calibrated to the industry standard of 100 Nits, the brightest, deepest blue is only going to be 7 Nits and be pretty fucking dull - but if you're watching on a high quality OLED screen with a maximum output of 800 Nits, you suddenly have 56 nits of blue to play with before maxing out. So it's not just the brightest, peak blue - it's all the subtle gradations between the light baby blue of an early morning sky and the deep hues of midnight that'll have a new level of depth to them.
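
The napkin math behind that, if you're curious: the blue primary only carries about 7% of white's luminance in Rec. 709, so the blue headroom scales straight up with the panel's peak output (the 56 above is just the 7-nit figure multiplied by 8):

```python
# Blue contributes roughly 7.2% of white's luminance in Rec. 709.
BLUE_LUMA_SHARE = 0.0722
for peak_white_nits in (100, 800):   # SDR reference monitor vs. a bright OLED
    print(f"{peak_white_nits} nit peak white -> pure blue maxes out near "
          f"{peak_white_nits * BLUE_LUMA_SHARE:.0f} nits")
```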

For the time being, any OLED TV labeled HDR has to have a maximum output of 800 Nits, while any LCD with the label has to output 1,200 Nits - but even that's only on minuscule highlights, where a full screen of white is notably dimmer for both. The reason they have different scales is that OLED can actually turn individual pixels off, giving it greater perceived contrast than LCD, even with a lower light output. That said, both technologies are fudging this stuff - OLEDs have Automatic Brightness Limiters, the same as Plasma did, which means that while a small reflection can have a crazy high light output, a full fade-to-white will be substantially lower, and you can watch the whole screen dim as large, brightly colored objects come into frame. LCD uses local-area dimming, which... well, kinda' sucks, but in a completely different way.

In both cases, however, they all pale next to a reference Dolby grading monitor that outputs a terrifying 4,000 Nits! We're likely not going to see that for home use until we come up with some crazy new technology to power it, sadly, but mentioning the word "Dolby" brings up the other big stinker in this new tech:

問題三: THERE'S AN ALL NEW,
WEIRDLY EXCITING FORMAT WAR!

Without delving into I-could-be-compromising-NDAs-here territory, I don't mind saying that my day job consists of a lot of transcoding-server-based wizardry that'd come off as mundane and even disappointing to most readers - developing templates to convert one kind of file into another, automating broadcast-friendly audio normalization to different international specs, dumb stuff like that. But the 4K content rollout happening right now has left me as the front line in telling clients who want to get their feet wet what we can and can't give them, and holy hell, has it been equal parts enlightening and infuriating to follow.

The short version is that because each TV has different color and light capabilities, they've developed what they call "PQ" - or Perceptual Quantization - to make sure every display shows the movie as closely as possible to what the graded master was intended to look like. In other words, the source media is always encoded as Rec. 2020, but a metadata setting tells the TV what levels it was graded at - so if the movie was graded on a 3,000 cd/m2 (or "3,000 Nit") monitor and your TV can only handle 1,200 Nits, it'll actually shrink the color gamut/dynamic range by 60% so that you keep as much of the color as possible without resorting to clipping and blowing things out of proportion. This includes values for absolute red, green and blue data as well, which means the chroma and luma of the signal are scaled to the appropriate points separately - so if your TV can't handle the maximum brightness but it can handle the maximum red, it's not going to compromise one for the other.
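
To put that "shrink by 60%" bit into something concrete, here's the idea reduced to its dumbest possible form - a flat linear scale from the mastering peak to the display peak. Real HDR tone mapping works on the PQ curve with soft roll-off in the highlights, so treat this strictly as an illustration of the arithmetic, not how any actual TV does it:

```python
# A deliberately simplified sketch: scale graded levels down to what the panel
# can actually reproduce, instead of letting the top end clip.
def naive_display_adapt(level_nits: float, mastering_peak: float, display_peak: float) -> float:
    """Map a mastered luminance value onto a dimmer display by linear scaling."""
    scale = display_peak / mastering_peak        # e.g. 1200 / 3000 = 0.4
    return level_nits * scale

# A 3,000 nit highlight from the grading suite lands at 1,200 nits on the TV,
# and everything below it keeps its relative placement instead of clipping.
for mastered in (3000, 1500, 100):
    print(f"{mastered} nits mastered -> {naive_display_adapt(mastered, 3000, 1200):.0f} nits on a 1,200 nit set")
```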

It's actually a really, really cool idea, and having seen demos of it in action, I can say that PQ is a damn fine thing. Now, my understanding of HDR is that even if the color never falls outside of Rec. 709 (or P3), the contrast between black and white is effectively infinite, and only limited by the display itself. This, admittedly, makes the lack of HDR enhancement on the DotD 4K release something of a missed opportunity, but considering both how niche this title is nearly 40 years later, and how experimental the hardware is, I can almost forgive seasoned professionals unfamiliar with HDR for not diving into the new format and botching it, and instead focusing on delivering the best quality SDR presentation they know how to... particularly when you factor in that, even with current HDR content, the average brightness between an HDR and SDR output is typically comparable - the HDR version just has a lot more dynamic range, which lets you avoid clipping highlights and crushing shadows while still staying within the realistic 500~1,000 cd/m2 light output of the average consumer TV.

...SO, WHERE'S THAT WAR, EXACTLY?

The bigger problem is that there are different kinds of HDR Metadata. The two most common right now are HDR10 and Dolby Vision - the former has fixed coordinates, meaning the whole movie has a single max output for each color from start to finish, while the latter is a dynamic solution, meaning each scene can be tweaked individually to compensate for low lighting - say, making sure a campfire is displayed properly alongside the following scene taking place on an overcast afternoon. Adjusting PQ on a shot-by-shot basis is great, particularly when it comes to downscaling HDR content back to SDR - something you could easily do via algorithmic automation on a Dolby Vision master, since it would scale everything right down to the Rec. 709 limit, but which would require plenty of hand-holding on an HDR10 source to avoid crushed shadow detail and clipped highlights.
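
Here's a toy example of why per-scene metadata matters when you're mapping down to a less capable display - the scene names and nit values are completely made up, but the shape of the problem is real: one global peak drags every dim scene down with it, while per-scene peaks don't:

```python
# Hypothetical scenes with their mastered peak brightness, in nits (invented numbers).
scenes = {"campfire at night": 2000, "overcast afternoon": 300}
TARGET_PEAK = 100   # pretend we're folding everything back down to an SDR-ish display

global_peak = max(scenes.values())
for name, scene_peak in scenes.items():
    static_scale = TARGET_PEAK / global_peak     # one number for the whole movie (HDR10-style)
    dynamic_scale = TARGET_PEAK / scene_peak     # tuned per scene (Dolby Vision-style)
    print(f"{name}: static metadata maps its peak to {scene_peak * static_scale:.0f} nits, "
          f"dynamic maps it to {scene_peak * dynamic_scale:.0f} nits")
# The overcast scene gets squashed to ~15 nits under the global scale,
# while per-scene metadata lets it use the full 100 nit target.
```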

So what's the industry standard? There... really isn't one - and certainly not a permanent one. Oh sure, the Society of Motion Picture and Television Engineers (SMPTE) put their stamp of approval on HDR10, but Netflix is 100% behind Dolby Vision, which requires hardware on the display that knows how to handle the dynamic metadata outside of apps. Samsung knows that Dolby Vision's dynamic properties are here to stay, so they're currently working on an open-standard dynamic equivalent to Dolby's solution... but whether it'll require all new hardware or be shunned due to partnerships with Dolby is anyone's guess. (There's also a rumor that HDR10 Metadata is limited to 1,000 Nits, but having seen it for myself I can tell you that's certainly not true. It is limited to 10-bit, however, while Dolby Vision goes up to 12-bit.)

That's before we even talk about Hybrid Log Gamma, a really clever method to make HDR content backward-compatible with SDR hardware without any Metadata at all - it just outputs a fixed (pair of) gamma curves, and trusts that the device it's being fed to is calibrated somewhere in the ballpark of "correct". The downside is that it features no PQ, which means all levels have to be fixed on a static 1,000 Nit scale. That doesn't sound terrible at first, since most displays on the market are only hitting about 800~1,100 Nits anyway, but as it's only being used for what could charitably be called experimental broadcasts in Europe for the time being, it's all kind of a moot point... for now.
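
For the curious, that "fixed pair of gamma curves" isn't hand-waving - the HLG transfer function published in ARIB STD-B67 / BT.2100 really is just a square-root segment stitched to a log segment, and it fits in a dozen lines. This is my reading of the OETF (scene light in, signal value out), constants straight from the spec:

```python
import math

# BT.2100 HLG OETF: sqrt() handles the SDR-compatible bottom end,
# the log segment handles the highlights. Scene light E is normalized to 0..1.
A = 0.17883277
B = 1 - 4 * A                    # 0.28466892
C = 0.5 - A * math.log(4 * A)    # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light to a normalized HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

for e in (0.0, 1 / 12, 0.25, 0.5, 1.0):
    print(f"scene light {e:.3f} -> HLG signal {hlg_oetf(e):.3f}")
```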

Odds are there would be more support for it if the only thing that used it wasn't VP9, but word is H265 will be implementing HLG profiles for 2017, so buckle up kids - this shit's just getting started.

Праблема нумар чатыры:
3D BASICALLY FUCKED 4K FROM THE START

Even with all the exciting changes in perceptual contrast and color gamut... there's also the limitation of the resolution itself. Namely the fact that most Hollywood movies finished over the last 15 years or so are limited to 2K resolution.

Let me clarify that for a second: Despite professionals slowly but surely moving towards digital photography as a more cost-effective and (arguably) flexible alternative to celluloid, movies are still - by and large, at least - shot on 35mm film. These days, 35mm is typically scanned in at 4K and then edited as a digital image sequence called a DI - or "Digital Intermediate". Despite the confusing naming conventions surrounding it, 2K 1.78 is exactly the same resolution as 1080p HD - 1920:1080 - and since virtually all 2K sources will ultimately be seen on Blu-ray or HD streaming anyway, the difference between 2K and HD is really negligible, at best.

The only reason it's been synonymous with "Better Than HD" for some time is because of the difference between an HD Telecine - that is, a real-time transfer of 35mm to HD video - and a 2K scan to uncompressed DPX image files, which are then graded and assembled on a 2K DI. In short, the "scan" part of the "2K Scan" is the important distinction - not the resolution itself.

Things get more complicated when you talk about a modern movie - something like, say, The Martian. While more or less every shot of Matt Damon was shot on 35mm film, it was then scanned into a digital file and placed on a DI for digital effects matting, and even for grading and general clean-up on shots filmed entirely on-set. Those scans may well have been done at 4K, but the DI itself - which gets differing resolutions from different cameras, post houses and so on - is at 2K. Even if the 2K DI was kept (which may or may not be the case), all of the color grading and effects compositing was done at 2K resolution, meaning the ultimate digital file that represents the final output of the movie is limited to 2K.

"But wait!" I hear some of you thinking. "Isn't THE MARTAIN already out on UHD-BD?" It sure is! But the dirty little secret is that Fox didn't actually re-create the entire movie in 4K. They simply went back to that same 2K DI source and upscaled it by 4X to produce a faux-UHD master. To be fair, upscaling HD to UHD is a much less butt-ugly process than upscaling SD to HD... but it still isn't "really" 4K. And that's what about half of the supposedly "4K" UHD titles kicking around in the market right now are. Crazy, right?

...WAIT, ARE YOU SERIOUS?!
FUCK! I'M SO MAD I'M SKIPPING NUMBERS.

Dead. Fuck'n. Serious.

But it's hard to blame them in some cases, I admit. MAD MAX: FURY ROAD - another early "4K" home video release - was shot entirely on Arri Alexa cameras at 2.8K resolution (2880:1620), and then scaled down to a 2K DI. So even if the 4K transfer was re-rendered from scratch, the resolution shot on location was never full 4K resolution to start with.

It's understandable that all-digital movies would be limited to 2K, but why the limitation on 35mm sourced films? The short answer is "3D". Unless a movie was actually shot using 3D cameras, you're basically having people manually trace and rotoscope individual objects in post to add depth, and that means you need multiple layers of the same footage. And since this is a process done effectively at the last minute, and is also the way most Hollywood blockbusters make guaranteed extra money at the box office, it's seen as a necessary evil... even by directors like Guillermo del Toro and Zack Snyder, who are about as open in their disgust for the process as they're allowed to be before the producers give them a talking-to about shitting on the profit margins.

So why not just use 4K DIs? The short answer is that it's just not feasible to do it for all projects on the sort of turn-around time expected of a feature film - and it will have a sizable impact on the budget, even if it's a fraction of the overall cost. It comes down to raw resources, and if you're a Hollywood producer who's trying to maximize profits and the tech guy in the office says that 2K work is a lot cheaper than 4K work, odds are that's the first thing that's getting axed. This is why, despite Sony scanning everything at 4K to get the most out of the raw scan possible, we get full 4K restorations of Taxi Driver but a title like Fright Night is scaled down to 2K for further clean-up and grading - the higher-resolution scanning is baked into the cost of having the hardware to do it, but taking the time and bandwidth to continue working in 4K adds up pretty quick. As an example, a feature-length 2K master of a 90 minute film will take up about 1.5 TB of disk space alone! A 4K DI has four times the resolution and, yes, will take up four times that space, to say nothing of the added strain on whatever network is forced to decode that much raw bullshit at any given time.
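
If you want to sanity-check those numbers: assuming a 10-bit DPX image sequence (which packs three 10-bit channels into 4 bytes per pixel) at 24fps, the back-of-envelope math lands in the same ballpark - a bit under, before you add handles, mattes and alternate versions - and it quadruples the moment you step up to 4K:

```python
# Rough DPX sequence size: width x height x 4 bytes per pixel, per frame.
def dpx_sequence_tb(width: int, height: int, minutes: float, fps: float = 24.0) -> float:
    bytes_per_frame = width * height * 4
    total_bytes = bytes_per_frame * fps * minutes * 60
    return total_bytes / 1e12   # decimal terabytes

print(f"90 min at 2K (2048x1080): ~{dpx_sequence_tb(2048, 1080, 90):.1f} TB")
print(f"90 min at 4K (4096x2160): ~{dpx_sequence_tb(4096, 2160, 90):.1f} TB")
```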

SO, BIGGER FILE SIZES...
SO WHAT? JUST, MAKE 'EM BIGGER?

For better or worse, I spend every day dealing with network latency, side-eyeing mismatched source content and queueing up transcodes for both production and delivery purposes on a server that was effectively built to handle HD material in real-time. Think of it this way: Even when all you're doing is jamming cuts of meat down a conveyor belt into a meat grinder, suddenly cramming four times as much meat into the same sized pipes is gonna' cause all sorts of panic you, and the rest of your team, simply aren't ready for. You've got three seasons of SD content? Beautiful, spread the cheeks on that thing and go to town. You've got THREE 4K source files?! We're basically fucked the rest of the day, and anything not-4K-related that you expect to get out the door by 5 is going to require a very special request.

Even if your goal is simply to take the finished master and convert it to 3D, keep in mind that conversion is effectively rotoscoping, multiple people are going to be working on the same title at the same time, and the bandwidth required to hit the same 6TB file from multiple workstations is ridiculous. It can be done - native 4K DIs for titles ranging from The Smurfs 2 to Elysium are proof enough - but if 3D is part of the picture, 4K is barely worth the expense, and sticking with 2K allows you to farm work out to multiple, less insanely equipped sub-studios. There is, technically, "4K 3D IMAX" in select theaters... which is always, without exception, upscaled from the 2K 3D masters, even if a 4K master for the "Flat" version exists.

The process of rendering a full-length movie is often simply too data-intensive (i.e. "too expensive") to justify doing at resolutions beyond 2K. The market for higher-quality 3D is already negligible, and quite frankly, the overwhelming majority of consumers are so clueless they wouldn't know the difference between 720p and 2K - forget HD and UHD. Hollywood caters to the lowest common denominator with the biggest payout... which happens to be 2K 3D. The saddest part of this reality is that while I'm not thrilled at the idea of purchasing a 2K DI sourced movie in "upscaled 4K", the added benefits of H265 compression, PQ contrast and HDR color are enough to get me to at least consider buying the more expensive version, even if I don't have a fucking display that'd know what to do with any of it yet. The fact that virtually all UHD-BDs released so far include a "Restored in 4K" Blu-ray copy only makes that jump a little easier to swallow.

In short, 2K DI was easy to convert to 3D, which means nobody wants to invest in 4K DI. Weak.

BUT WHAT ABOUT OLDER MOVIES?!
TELL ME OLDER MOVIES ARE OKAY...

Anything finished on 35mm can be re-scanned at 4K. Otherwise I'd be warning you about Dawn of the Dead rather than throwing fitty bones at it. But realistically, the leap between 2K resolution and 4K resolution is probably going to be quite a bit more subtle in most cases - most sane people don't sit within 6 feet of a 55" TV, after all. If anything, the most obvious improvements will come from the variable block sizes and 10-bit refinement of H265 as a codec... but neither of those things actually needed higher resolution to begin with. Much like "2K" vs "HD", the difference has more to do with the mastering process than the actual output resolution. 4K is better, make no mistake, but it's also not why most 4K masters are a dramatic improvement over dated HD scans on their own. That's why the "Remastered in 4K" versions of Ghostbusters and Sam Raimi's Spider-Man looked so much better than the prior Blu-ray releases, even at 1080p resolution.

Don't get me wrong, you should buy 4K restorations of classic films if you have the money to spare. But the market is going to be even smaller than Blu-ray has already become, and I have a feeling the titles we see are going to be titles that the guys running the labels know will sell, and titles they happen to love personally. Dawn of the Dead is something of an outlier, and I fully expect the majority of 4K movies to be trash nobody would ever actually want to watch outside of penance.


WAIT, WHAT ABOUT THAT NEW PS4...

I may talk about the PS4 Pro in another post. Short version is it isn't really 4K, just clever upscaling and some supersampling for any schmucks buying this for playback at 1080p (complete with frame-drops not featured on the base hardware!), but for $400, it's "close enough" to UHD resolution.

Holding aside the Metadata and Display Gamut nonsense, one of the few places where that crazy expanded colorspace really would be an amazing boon is video games. Digital creations aren't limited by 35mm exposure, in-camera sensors or available local lighting, and the thought of the boiling reds and alien greens in something like, say, the 2016 DOOM reboot with enhanced grayscale and color fidelity could be absolutely brilliant if done properly - and a handful of PS4 and Xbox One games already have full HDR support, even on the older-model consoles, though of course, as displaying HDR samples on SDR monitors is pretty much impossible, there aren't many useful A/B comparisons floating around yet...

The issue, however, is how HDTVs handle those signals. Most displays process incoming footage in different ways - upscaling, deinterlacing, color correction and so on - and in the case of movies, it... doesn't really matter. In virtually all cases movies are shot, edited and shown at 23.98fps and play at a locked framerate from start to finish. No problems there. Anyone who plays games - well, anyone who plays shooters, rhythm games, anything where instantly reacting to game stimuli is required - will know that reflexes are important... but also rendered completely fucking moot if your display has any major input lag.

Back in the days of NTSC CRT, there was no delay to speak of - the input was virtually instantaneous over the analog connection - but HDTVs, and now UHD displays, require time to properly process whatever signal they're being fed. Most game consoles, and anyone gaming on PC who isn't using a specially designed high-framerate monitor, are locked to 60fps (well, 59.94Hz if you wanna get technical), which means there's about 16.7ms between each frame. Theoretically. Slowdown and frame-pacing are a thing, but let's ignore that for the time being and just say that a great TV will give you one frame of delay, while an average TV will give you about two frames' worth. And, yes, video game mechanics themselves are slower now than they were a decade ago to account for this phenomenon, if you can believe it.
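
If you want to translate the lag figures that get thrown around into "frames behind your thumbs", the conversion is trivial - the specific millisecond values below are just the ones that come up later in this post:

```python
# Display lag in milliseconds vs. "frames of delay" at a 59.94 Hz refresh.
REFRESH_HZ = 59.94
FRAME_MS = 1000 / REFRESH_HZ      # ~16.7 ms per frame

for lag_ms in (15, 33, 60, 100):  # desktop monitor, good TV game mode, laggy set, worst case
    print(f"{lag_ms} ms of lag = ~{lag_ms / FRAME_MS:.1f} frames behind your inputs")
```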

*SMUG SNORT!* PC MASTER RACE HERE, PEASANT!
I GET A HUNDRED FRAMES PER SECOND. ON ULTRA.

Yes, you do. And all of that is irrelevant when discussing delay, because it's measured in actual milliseconds going into the display. Desktop-style monitors typically have ~15ms of delay these days, even on cheap garbage monitors, but you're never gonna' get that sweet, sweet HDR profile for Deus Ex: Mankind Divided on a 1440p screen that runs 144Hz, now are ya?

And yes, I've recently upgraded to a GTX 1080. Feels good to play at 2160p, even if it's currently being scaled down to 1080p. What is aliasing, again?

When you buy a TV and you know you're going to play games on it, you probably set it to "Game" or even "PC" input. Why? Because that disables plenty of the internal processing and lets you get data in as fast as humanly possible. ~33ms is typically as good as a large display is going to get, and is fairly playable at anything that isn't a competitive shooter or fighting game - but anything beyond that is going to introduce an obvious delay that could be annoying at best, and unplayable at worst. The only monitors with better response times tend to be smaller desktop monitors that forgo a lot of the basic conveniences of "TVs" - nothing wrong with that, but if you're sitting 2 feet away, you probably don't want a 55" UHD monitor to start with. (Well, I do. But I also want a three-foot erection that breathes fire, so grain of salt there.)

With all that in mind, let's remember that UHD with HDR is not only feeding four times the resolution and an expanded gamut over the same connection as an HD master, but the display also has to apply the HDR metadata so the image is shown properly. In other words, HDR was designed to be used for movies more than games, and the hardware implementation reflected that... at the expense of games being actually playable. While plenty of TVs have had firmware updates to speed up response times, plenty of sets still have over 60ms of response time - or about a 4-frame delay - while Sony's current Bravia line - the one marketed as the "Perfect Match" for the PS4 Pro, no less! - has over 100ms of lag on UHD with HDR enabled. In other words, it's fucking unplayable.

To be fair, a lot of current models have had firmware updates that dramatically cut down on input lag once third parties ran proper diagnostics. For the love of Pete, always check Rtings.com - though I assume the first generation or two of HDR sets (which'll probably be really cheap if you can find them kicking around) are never going to be updated, largely because people who do play games don't dive in when the hardware is crazy expensive and can burn in like a mother. Ask anyone.

Even so, the fact that this wasn't even a consideration is shocking to me. Truly, if there's one thing I want out of a $1,500~5,000 TV, it's for my naturally shitty skill at vidya to be exponentially amplified by a full tenth of a second. I need to respawn in every room six times on Ultra Violence anyway, so the last thing I need is to have the game trailing behind my obvious lack of skill.


I'll stop mentioning Doom 2016 when I'm damn good and ready to.

SO... WHAT'S IT ALL MEAN IN THE END?
IS KENTAI SAYING -DON'T- GET 4K HARDWARE?

Light output needs to improve before the colorspace is ever going to reach P3 levels (with "100% Rec. 2020" being science goddamn fiction for the foreseeable future), and unfortunately edge-lit LEDs - the cheap, lightweight garbage people love to waste money on - aren't capable of that to begin with. Despite being incredibly pretty, OLED isn't as bright as a full-on backlit LED (yet). As the tech improves and the price comes down, it'll start to be feasible to have both full P3 coverage and at least 1,200 Nits - which, from any sane perspective, should have always been the bare minimum for "HDR". Instead it's being used as a gimmick to separate the cheap 4K sets from the expensive 4K sets, which is fucking deplorable.

There's nothing wrong with a 4K SDR monitor, and I've considered getting one myself - but the HDR monitors that are available right now are such bare-minimum efforts, it kinda' makes me want to have no part of this shitshow until the 2017 models have locked horns long enough to get at least a 100% P3/1,200 Nit standard. Maybe by then the HDR Metadata will have been figured out, too - though I wouldn't get my hopes up on Streaming and Disc having a single, unified standard, much as that would make every consumer's life a lot easier.

Unless you get a crazy good deal on something really, really good, I say save your money for another year. The tech has to mature, and it's only going to get cheaper anyway, and unlike Blu-ray and the PS3 a decade ago, the Xbox One S just... isn't an appealing enough console to justify getting as a cheap media player with games. Not for someone with no love for Halo and Gears of War, at least.

As for the Dawn of the Dead box, we'll have to talk about that another day.

3 comments:

THEGODDAMNZOLLMAN said...

Dawn, like Germany's Texas Chain Saw Massacre UHD release, isn't even in HDR.
It's just 10-bit 2160p.

Michael said...

Really great summary of the UHD situation, Russ. Thanks for writing it - I learned a great deal.

I've actually picked up a cheapish 4K LCD (with HDR) and UHD player for Christmas. I'm aware it's less than ideal (jumping on the OLED bandwagon simply isn't within my budget right now), but my impatience got the better of me and I've never been one to sit on the sidelines when a new format is in play. Over the last 12 months or so, I've been buying UHD versions of new releases wherever possible and so have a ready-made library of about a dozen titles to try out on it come the 25th.

As for DAWN OF THE DEAD, with the € to £ exchange rate being what it is, I couldn't justify it. Be very interested to read your thoughts, though.

Anonymous said...

Out of humourlarity, have there been any 4K UHD releases with DNR smeared on them yet? I've seen the compression issues of course.

Definitely not getting an Xbox One Ass until at least Red Dead 2 is out, and it'll be even cheaper by then.