This is where movie babies come from. DON'T LOOK!
Cathode-ray tubes have been used to transfer film to video since the 1950s, and some film labs are still using them to this very day to create HD masters. This is essentially accomplished by shining a ray of light generated by a CRT (A) through the film print (B); that light is then reflected by light-sensitive mirrors (C/D) that separate out the red, green and blue information, which photomultiplier tubes, or PMTs (E/F/G), convert into measurable light data in three separate primary colors. That data is then digitized into 0s and 1s and exported as HD video.
This is what SCIENCE looks like.
(Also, mediocrity and obsolescence.)
In other words, there are about six completely separate analog components that need to be perfectly aligned and functioning before any of it goes digital, and if any part of the chain is malfunctioning it can produce unwanted side effects - including blurring from poor physical calibration, and analog noise from failing vacuum tubes. It can also produce uneven color levels if just one of these components isn't working right - say, a lack of blues, which makes everything else in the image look too yellow. A colorist can only correct what was actually captured during the film transfer, so ensuring that all three colors are properly accounted for is extremely important.
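To make that "you can only correct what was captured" point concrete, here's a toy sketch (my own illustration, not anything resembling real telecine code) of one color channel being captured at reduced strength and then boosted back in post. The gain values and the 8-bit quantization are assumptions for the demo:

```python
# Toy model: why a weak analog channel can't be fully recovered
# after digitization. Not a real telecine pipeline.
def capture(value, gain):
    """Simulate one color channel: apply analog gain, then
    quantize to 8-bit the way the transfer hardware would."""
    return max(0, min(255, round(value * gain)))

def correct(captured, gain):
    """A colorist boosting the weak channel back up in post."""
    return max(0, min(255, round(captured / gain)))

# A healthy channel (gain 1.0) keeps fine gradations...
healthy = [capture(v, 1.0) for v in (100, 101, 102, 103)]
# ...but a failing blue channel at 20% strength collapses them:
weak = [capture(v, 0.2) for v in (100, 101, 102, 103)]
restored = [correct(c, 0.2) for c in weak]
print(healthy)   # [100, 101, 102, 103] -- four distinct values
print(weak)      # [20, 20, 20, 21] -- detail merged together
print(restored)  # [100, 100, 100, 105] -- the gradations never come back
```

Once those four distinct source values get squashed into two captured values, no amount of grading brings the originals back - which is exactly why a transfer with one dead channel is unfixable downstream.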
Now here's where it gets kinky...
Most - but certainly not all - modern film transfers use a CCD telecine. Basically, they use a xenon-based flashtube (A) to produce an instant flash of white light, which passes through the film print (B) and is then separated by a light prism (C) and/or, again, light-sensitive mirrors (D). The prism separates out the red, green and blue information and sends it to three separate charge-coupled devices, or CCDs (E/F/G), which transform all of that analog light information into digital color data.
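If it helps, the prism's job can be sketched as a crude sorting function - each wavelength band of the flash gets routed to one of the three sensors. The band boundaries below are rough assumptions for illustration; real dichroic optics are far more precise than this:

```python
# Toy sketch of the CCD telecine's color split. The wavelength
# cutoffs are assumed values, purely for demonstration.
def split_rgb(spectrum):
    """spectrum: list of (wavelength_nm, intensity) samples of the
    light that made it through the film frame. The prism routes each
    band to one of three sensors, which sum up whatever hits them."""
    red = green = blue = 0.0
    for wavelength, intensity in spectrum:
        if wavelength >= 600:      # long wavelengths -> red CCD
            red += intensity
        elif wavelength >= 500:    # middle band -> green CCD
            green += intensity
        else:                      # short wavelengths -> blue CCD
            blue += intensity
    return red, green, blue

# A white-ish flash filtered by an imaginary frame of film:
frame = [(450, 0.8), (550, 1.0), (650, 0.9)]
print(split_rgb(frame))  # (0.9, 1.0, 0.8)
```

Same principle either way: one beam in, three channels out, with each sensor only ever seeing its own slice of the spectrum.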
It actually works on a very similar principle to the CRT device, it just does it with more modern components. The big difference between them is that CCD is considered more stable technology than CRT: the xenon bulbs are cheaper, easier to replace, and last longer than a CRT tube, and CCD array-based telecine devices generally don't have the alignment problems CRT scanners do, either. Both capture analog film information by flashing light through celluloid toward analog-to-digital sensors, but if one costs less to upkeep and doesn't require semi-regular calibration, I'd say it's the superior format by default... but let's not rule out CRT without good reason, right?
As I've pointed out before, LVR is using a Cintel DSX as their only advertised telecine device. Cintel is a UK-based manufacturer that makes exclusively CRT-based hardware, and if you click the fourth image down on that page the caption even reads "Flying Spot CRT Telecine System". Whether LVR or Cintel (and other related CRT devices) is really "at fault" for the final product is up for debate, but there's no dodging the fact that this is what they're using to make Media Blasters' shockingly shitty-looking HD transfers. It's also been reported by John Sirabella that Beyond the Darkness was transferred three separate times, which makes me wonder if that one project was done on another device entirely... if nothing else, it certainly looks different from anything else MB or BU have released in HD.
Even on Cintel's own website, they list the following 'negatives' for CRT scanners:
- Seen as expensive as the CRT can require replacement every 2 or 3 years.
- Can require complex circuitry to mask ageing effects.
- Sometimes requires expert alignment.
Their words, not mine. Amusingly, at the end of the FAQ, the one real advantage a CRT device has over a CCD scanner in creating the final product - and this is according to the guys who make them for a living - is "a deeper, filmic look". That's right, the only advantage they're able to pull out of their spinning mouths is that CCD produces "a flatter, cleaner, video look". And you know what? That's totally believable. In fact, when I watched Taxi Driver on Blu-ray - you know, probably the single sharpest, grainiest, and most film-like transfer to hit stores in the last 12 months, made completely on a CCD scanner, make no mistake - the only thing I could think was "Wow, this looks pretty good... but if only there was some CRT noise it'd be PERFECT!"
A "deeper, filmic look", huh?
As I've said before, this is not film grain. This is exaggerated video noise caused by a less-than-optimal CRT film scanner. Whether the film scanner is completely worthless or just needs one hell of a tune-up I can't rightly say, but I can (and will) say without hesitation that I own DVDs with more detail than these two "1080p" atrocities.
This has got to stop, and the only way things are going to change is if we demand it. Don't buy shitty releases; wait for reviews from someone you trust, and don't bother upgrading from a good DVD to a mediocre Blu-ray. For fuck's sake, if we can't think of a better use of our hard-earned money than that, we're clearly not trying hard enough!
If any of this information (basic as it's supposed to be) is misleading or incomplete, I do apologize. I'm not a professional lab colorist or a telecine repair tech or whatever, I just gather info on the subject from both private and public sources as I'm able to. If any readers out there ARE colorists, or otherwise have experience with both CRT and CCD setups, I'd absolutely love to hear your take on this issue in the comments.