Friday, December 14, 2012

Serial Experiments Lain: Layer 03 END

UPDATED TO REFLECT THE FACT THAT I'M NO LONGER SUFFERING FROM MILD INSOMNIA-INSANITY!

 

 Let's talk about 8-bit color depth.

The short version is that while two video tracks may be compressed at the same exact bitrate - or even uncompressed, as the case may be - they can be "uncompressed" at different color depths. Most video is rendered and captured at either 8-bit or 10-bit, the former giving us a mere 256 "stops" (ie: different shades) of Red, Green or Blue, while the latter offers a total of 1024 stops. Simply put, the higher your color depth, the less banding you'll create. And, yes, this is exactly why there's been such a push in certain "sharing" circles to use 10-bit encodes: it produces a better looking final product without actually increasing the bitrate - just the CPU power required to decode it on the fly.
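If you want to see what that actually means in numbers, here's a rough sketch (numpy, purely illustrative - not anything pulled from a real encoder) counting how many distinct shades an 8-bit versus a 10-bit pipeline can hand out across a single full-width gradient:

```python
# Illustrative only: how many distinct shades fit across one smooth 1920-pixel ramp
# at 8-bit vs 10-bit depth. Fewer shades = wider flat "bands" per shade.
import numpy as np

ramp = np.linspace(0.0, 1.0, 1920)                        # a full-width horizontal gradient

levels_8bit  = np.round(ramp * 255).astype(np.uint16)     # 2^8  = 256 possible stops
levels_10bit = np.round(ramp * 1023).astype(np.uint16)    # 2^10 = 1024 possible stops

print(len(np.unique(levels_8bit)))    # 256  -> each shade stretches ~7-8 pixels: visible banding
print(len(np.unique(levels_10bit)))   # 1024 -> each shade stretches ~2 pixels: effectively smooth
```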

HDCAM-SR is a 10-bit format. D-5 HD is a 10-bit format. Even good ol' SD Digibeta is a 10-bit format! But more importantly, Blu-ray and DVD are also 10-bit formats, and as such, absolutely everything in your workflow should include at least 10-bit color depth.

UPDATE: I'd been under the above impression for years, largely because good MPEG encoders have always included an option for 8-bit or 10-bit source handling, but I clearly forgot how goddamn old the Blu-ray and DVD specs really are. I was mistaken: 4:2:0 YUV is limited to an 8-bit colorspace during the final BD and DVD transcode - though chroma subsampling has also dramatically decreased the importance of those stops by... basically destroying the color in its entirety. Read about all that crap HERE if you aren't familiar.
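For the sake of illustration, here's roughly what 4:2:0 does to the color before bit depth even enters the picture (my own toy example, not the actual BD/DVD spec machinery): for every 2x2 block of pixels, luma keeps all four samples while each chroma plane keeps just one.

```python
# Toy 4:2:0 layout: full-resolution luma, chroma halved in both directions.
import numpy as np

h, w = 1080, 1920
Y  = np.zeros((h, w), dtype=np.uint8)            # luma: one sample per pixel
Cb = np.zeros((h // 2, w // 2), dtype=np.uint8)  # chroma: one sample per 2x2 block
Cr = np.zeros((h // 2, w // 2), dtype=np.uint8)

print(Y.size)             # 2,073,600 luma samples
print(Cb.size + Cr.size)  # 1,036,800 chroma samples - a quarter of what 4:4:4 would carry
```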


Anyway, bottlenecking that data at the mastering stage - or even on the master itself! - can cause some pretty nasty side effects, like...



...you saw where I was going with this, right?

I've played around with some ideas in my head, but it wasn't until a charming little Anon asked me "Did you ever consider FUNi got an HDCAM source?" that the thought even popped into my brain. I mean, Christ, why would they? It's not any cheaper than the newer and more technically robust HDCAM-SR upgrade format, and unless you're a cheap bastard, a deck that plays one can play the other... then again, wouldn't that leave plenty of people, even industry-savvy types who do this day in and day out, numb to the differences between the two? Hell, I deal with both formats every fucking day and it never even dawned on me until now!

See, here's the crazy part in all of this: HDCAM, the predecessor to HDCAM-SR, hasn't held up especially well over the last 15 years. For one thing, the resolution is shorn down to 1440x1080 pixels, creating a 16:9 image with non-square pixels. For another, it's literally the only thing in the world that uses a 3:1:1 chroma subsampling scheme, mostly because... you know, it's crazy. Its bitrate of about 144 Mb/s was impressive in the late 90s, but HDCAM SR now offers three times that - or even six times, if you're willing to cut down on runtime and use "HQ" encoding. But the most relevant part of the format when it comes to Lain is that it's natively an 8-bit format. Fancy that! A format where potential color banding is fucking BUILT IN to the spec!
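For anyone who wants the arithmetic spelled out, here's a back-of-the-envelope sketch using the commonly quoted rates (roughly 144 Mb/s for HDCAM, 440 Mb/s for HDCAM-SR and 880 Mb/s for SR's "HQ" mode - ballpark figures, not gospel):

```python
# Ballpark figures only - commonly quoted video data rates for each format.
hdcam_rate, sr_rate, sr_hq_rate = 144, 440, 880   # Mb/s

# HDCAM stores 1440x1080 with non-square pixels that get stretched out to 16:9.
pixel_aspect_ratio = 1920 / 1440

print(round(pixel_aspect_ratio, 3))       # 1.333 - each stored pixel covers 1.33 display pixels
print(round(sr_rate / hdcam_rate, 2))     # ~3.06x HDCAM's bandwidth
print(round(sr_hq_rate / hdcam_rate, 2))  # ~6.11x, traded against tape runtime
```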

Alternately, FUNimation - or Geneon/Universal, or whoever FUNimation has digitize their tapes - could have gotten a 10-bit HDCAM-SR master and ingested it as 8-bit material. Most video cards have an option to capture at 8-bit that'll core off the additional color depth, and yes, the most obvious flaw will be color banding on smooth gradients. In other words, FUNimation probably DID get HDCAM "classic" masters... or if they didn't, the material was likely treated as if it was, and that clearly didn't help anything.
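If it helps, here's a hypothetical sketch of what an 8-bit capture of a 10-bit master boils down to - the card effectively drops the two least significant bits, so four neighboring 10-bit shades collapse into a single 8-bit one:

```python
# Hypothetical ingest: truncating a 10-bit source to 8-bit during capture.
import numpy as np

master_10bit  = np.arange(0, 1024, dtype=np.uint16)     # every shade the master can hold
captured_8bit = (master_10bit >> 2).astype(np.uint8)    # two low bits cored off on ingest

print(len(np.unique(master_10bit)))    # 1024 distinct shades on tape
print(len(np.unique(captured_8bit)))   # 256 distinct shades after capture - hello, banding
```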

UPDATE: "But if the delivery format is capped at 8-bit, what difference does it make if the source is 8-bit?" I Due to my own confusion, that's a perfectly fair question! But based on the limitations of HDCAM described above, it's akin to asking if you're going to release something on DVD, why not just re-encode an existing DVD instead of going from an archival master? You can, technically speaking, but every flaw on that initial lower-bandwidth source is going to show up on the new version as well. It's also entirely possible that the BD encoder is specifically designed to account for the 10-bit to 8-bit conversion in a way that HDCAM, a native 8-bit format, is not; this was the root of my confusion going in, the fact that for the last near decade I've dealt with MPEG-2 encoders that offered "10-bit sampling", which affected whether then smoothed the 10-bit source to 8-bit properly via dithering, or just threw the "extra" color data away.

Food for thought: When I first arrived at my current job, I was told we capture everything at 8-bit color depth. Doing a bit of research into our codec of choice, I wondered why I couldn't set it to 8 bits manually... only to find out that our codec settings were always 10-bit in nature. We were coring off color depth in an attempt to save space, but since the codec is configured for 10 bits no matter what you feed it, we were literally just losing color data during capture! Having confirmed my suspicions, I explained to the people who oversaw this part of the process why that's a bad thing, and with some samples to prove that capturing everything at 10-bit wouldn't affect file sizes and would only increase quality... well, that was the end of it. 10-bit captures became the norm, with HDCAM upsampled during capture to 10-bit and all 10-bit formats ingested exactly as they should be.
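Here's the gist of that in miniature (illustrative only - obviously not our actual capture chain or codec settings): if the codec stores 10 bits per sample no matter what you feed it, an "8-bit capture" just gets padded back out. Same storage cost, a quarter of the shades.

```python
# Illustrative: an 8-bit capture stored in a 10-bit codec saves nothing and loses shades.
import numpy as np

source_10bit    = np.arange(0, 1024, dtype=np.uint16)
captured_8bit   = source_10bit >> 2          # cored off during capture to "save space"
stored_as_10bit = captured_8bit << 2         # what a 10-bit codec actually ends up holding

print(source_10bit.nbytes == stored_as_10bit.nbytes)   # True - not a byte saved
print(len(np.unique(stored_as_10bit)))                 # 256 - three quarters of the shades gone
```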

Don't get me wrong, HDCAM was an okay format... for 1997, when its only real competition was D-1. We've come a damn long way since then, and the only sane reason I can see for even looking at this dinosaur of a tape format is if you intend to have the show broadcast. Sadly, some of the largest television providers running today expect HDCAM 1080i and nothing else, so... yeah. Wrap your brain around that. It'd be like if someone only wanted your movie on Laserdisc, not DVD, even though there's a dozen reasons the former is inferior to the latter.

Sadly, 8-bit versus 10-bit conversions gone awry are something that'll probably haunt us for a while, especially in retards to international content*. It's a little sad I didn't even think about this as a possibility until now, but considering I did everything I could to eradicate anything 8-bit related where I work, it's a problem I've eliminated in our workflow - out of sight, out of mind. As I said, we get HDCAM on a regular basis, but most of the material we see is HD broadcast masters from up to a decade ago, so I can promise you that 8-bit color stop limitations are typically far from the biggest problems those materials have. You haven't lived until you've seen a hard-interlaced 1080i-to-1080p conversion, let me tell you!

This'll be the last time I talk about LAIN on this site, I swear. Honestly, I'm sick of it at this point; the release is what it is, and now we might have a definitive reason as to why. I don't regret spending $60 on it, and even Mrs. Kentai insisted that the box - heartbreaking or not - stay on her shelf, not mine. FUNi tried their best; they just got handed an inferior format and probably never thought twice that "HDCAM" wasn't as good as "HDCAM SR". People in this industry I consider more knowledgeable than myself have shrugged the differences off, and I've forgotten what those differences actually mean, so it's entirely possible FUNimation never even knew.

Now you do, FUNi. Insist on HDCAM SR, not the 8-bit, lower-resolution, super-subsampled older brother. Please?

*You just can't fake fuck-ups that smooth. Also, I was in a rush, so I've re-written a few points to be less... jumbley.

3 comments:

Jefferson said...

"...especially in retards to international content."

lol

Kentai 拳態 said...

...y'know what? Keeping it. That's easily the best Freudian slip I've made in weeks.

Anonymous said...

>Blu-ray and DVD are also 10-bit formats

Apart from Deep Colour, as far as I was aware DVD and BD are both 8 bits/channel formats. Is this just a pervasive myth? I can't find anything to the contrary to 8 bits/channel via google, but I may just be searching with the wrong terms.