Tuesday, October 14, 2008

DV. It's short for "Devil".

A thought snapped into my head some time ago that I could bypass the comb filter in the DVD recorder, if only I could get my hands on a DV pass-through device. The basic theory is that you feed analog video into the device, which digitizes every frame with synched audio and compresses the whole signal into a 25Mbps stream. I know for a fact that when the graphic designer I work with is designing DVD menus, he hooks his DVD player up to his DV camera and uses it as a pass-through, transferring the freshly digitized DV signal to his computer via FireWire. It's much easier than capturing raw analog data, because by the time it hits your computer it's already a series of 1s and 0s, skipping the need to do all that frame-synching with the audio and yadda-yadda. Sounds great, no?
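(For the curious, that 25Mbps figure isn't arbitrary. Here's the back-of-napkin math in Python - the 720x480 frame size and 29.97fps are straight from the NTSC DV spec, 12 bits per pixel is what DV's chroma layout averages out to (more on that horror in a minute), and roughly 5:1 is the ballpark usually quoted for its DCT compression step:)

    width, height = 720, 480
    fps = 30000 / 1001          # NTSC's 29.97fps
    bits_per_pixel = 12         # 8 bits of luma + 4 bits of chroma, on average

    raw_mbps = width * height * bits_per_pixel * fps / 1e6
    print(f"Raw digitized video: {raw_mbps:.1f} Mbps")        # ~124.3
    print(f"After ~5:1 DCT squeeze: {raw_mbps / 5:.1f} Mbps") # ~24.9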

...well, no.

The thing that proponents of DV (in NTSC countries) don't want to jump up and shout about is the chroma subsampling. The basic gist of it is that analog NTSC sources get digitized at 4:2:2, meaning for every 4 luma (brightness) samples in a row, you keep 2 samples each of the red-difference and blue-difference color channels. For various convoluted-yet-historically interesting reasons, the color (chroma) and black-and-white (luma) parts of the signal are carried separately, and since the human eye sees luma far better than chroma, you can save a LOT of bandwidth by shaving off some of those chroma samples, with a minimal loss in clarity (or so we're told). You can see some visual examples of the theory here.
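(The luma/chroma split itself is no exotic trick, by the way - it's a plain linear transform. Here's a sketch of the standard BT.601 conversion in Python, the full-range variant, nothing specific to DV or any other format:)

    def rgb_to_ycbcr(r, g, b):
        # Split an 8-bit RGB pixel into luma (Y) plus two color-difference
        # channels (Cb, Cr), per BT.601. The eye is picky about Y; Cb and
        # Cr are the parts subsampling throws away. Values left unclamped.
        y  =       0.299    * r + 0.587    * g + 0.114    * b
        cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
        cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
        return y, cb, cr

    print(rgb_to_ycbcr(255, 0, 0))  # pure red: modest luma, the chroma does the work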

So, analog signals - old fashioned TV broadcasts, VHS, Laserdisc, what have you - all get captured at 4:2:2. DVD is actually 4:2:0: on top of the horizontal halving, the chroma gets halved vertically too, so a single chroma sample covers a 2x2 block of pixels. The "zero" doesn't mean the color data isn't there at all; it's just the notation's weird way of saying the second row of the block carries no chroma samples of its own. Yeah, I know. It's frickin' ridonculous. Even modern Blu-ray HD transfers are 4:2:0, so while Digibeta does keep 4:2:2 data, you've never actually seen it on a digital home format.
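(If you want to sanity-check the bandwidth accounting, here's how the J:a:b notation shakes out - a sketch of the sample-counting, not any particular codec:)

    def avg_bits_per_pixel(j, a, b, bit_depth=8):
        # In a 2-row block J pixels wide: 2*J luma samples, plus
        # (a + b) samples each of Cb and Cr.
        return bit_depth * (2 * j + 2 * (a + b)) / (2 * j)

    for name, (j, a, b) in [("4:4:4", (4, 4, 4)), ("4:2:2", (4, 2, 2)),
                            ("4:2:0", (4, 2, 0)), ("4:1:1", (4, 1, 1))]:
        print(f"{name}: {avg_bits_per_pixel(j, a, b):.0f} bits/pixel")

Run that and you'll notice 4:2:0 and 4:1:1 both land on 12 bits per pixel. Hold that thought.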

So, why is this important? Because much as DV is a decent format for shooting to tape, with a lot of advantages in simplicity, NTSC DV has the god-awful chroma subsampling rate of 4:1:1 - a single chroma sample for every 4 horizontal pixels, at full vertical resolution. In theory this means it has worse color separation than the VHS tapes I'm trying to capture. Since the DV codec works the same way in software as it does in hardware, I converted a short clip full of subtle color information (from a DVD recording, which is already 4:2:0)... and these were the results.


[Frame grab: DV]

[Frame grab: DVD]

Yuck! Not only did it soften the details in the background and around outlines, it generally screwed with the color balance, added some visible chroma blocking that - in motion - looks absolutely horrifying, AND piled noise on top, despite being over 3 times the bitrate of the original file! This is the price you pay for ease of use, and frankly I'll take no part in it. I'm supposed to pay anywhere from $150 to $1,000+ for the privilege of capturing crappy-looking digital video to my PC?

No thanks.
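(If you want to see the chroma mangling for yourself without spending a dime, here's a quick-and-dirty simulation with Python, numpy and PIL. It skips DV's DCT compression entirely, so the real thing looks even worse - and "frame.png" is just a stand-in for whatever frame grab you've got handy:)

    import numpy as np
    from PIL import Image

    frame = Image.open("frame.png").convert("YCbCr")
    y, cb, cr = (np.asarray(ch) for ch in frame.split())

    def subsample_411(chroma):
        # Keep 1 chroma sample per 4 horizontal pixels, then stretch it
        # back out (nearest-neighbor; real codecs filter, but the blocky
        # horizontal smearing is the same idea).
        stretched = np.repeat(chroma[:, ::4], 4, axis=1)
        return stretched[:, :chroma.shape[1]]

    out = Image.merge("YCbCr", [Image.fromarray(a) for a in
                                (y, subsample_411(cb), subsample_411(cr))])
    out.convert("RGB").save("frame_411.png")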

Back to the TV Tuner route after all. It looks like there are okay-ish USB devices in the sub-$100 range, and at this point they're probably every bit as awesome in the hardware department as my ATI card ever was. Ah, technology...
