Color calibrating your monitor? (Seeing "true" color of your video captures)
Greetings Digitalfaq members!
My name is Mya and I have been a lurker for years; I just joined as I am about to start my own analog-to-digital video conversion projects. I've learned so much great information here over the years that I feel fairly confident my projects will be a success. I hope this is not a redundant question, as I've used the search functions and didn't find an answer. If anyone can offer some friendly advice on the following, I'd certainly appreciate it.

I'm wondering how you experienced analog-to-digital folks see the "true color" of your transfers on your monitor. I ask because each monitor seems calibrated slightly differently: some have colors that pop, while others, perhaps older ones, might not display the true color of the capture. Certain brands of monitors also seem to have their own unique nuances, characteristics, and visual enhancements. This could lead to user/archiver mistakes such as unnecessarily color-correcting a capture that doesn't need it, or brightening a capture that is already bright enough but merely appears darker on your particular display. Not seeing what the transfer really looks like would be a recipe for disaster, prompting unnecessary post-capture cleanup that could easily ruin the footage.

In a nutshell: how do you know that what you see on your monitor represents what the capture actually looks like, and is not a skewed image produced by the way your computer/monitor displays it? I appreciate any tips, suggestions, and ideas. Thanks so much!
The most reliable way is to buy a quality professional video monitor (as opposed to a TV set or computer monitor), but they are not cheap. You should also have a good video output card in your computer for evaluating/editing the captured video on that monitor.
Most computer monitors and graphics cards are not great for video (they were designed for a different color space). TV sets may include various automatic signal-processing features that "improve" the image: automatic color/tint, brightness/contrast adjustment, noise reduction, and overscan. Thus they may hide imperfections in the video signal they are fed, or simply be misadjusted (typical), and they may lack the user controls needed for precise adjustment/calibration.

I use a Blackmagic Intensity Pro to drive a video display from my NLE workstation (and to capture analog video). I also use an HD Spark to drive video output on a system running the Edius NLE. If you want to check your capture system, one option is to feed it a known calibrated source (e.g., SMPTE color bars from a video signal generator) and view the resulting captured file in an NLE that includes a vectorscope/waveform capability.
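Beyond eyeballing the bars in an NLE, the captured file's levels can be checked numerically. Here is a rough Python/NumPy sketch (my own illustration, not from this thread) that compares each bar of a captured SMPTE 75% color-bar frame against nominal Rec. 601 8-bit luma values. The patch geometry, function names, and the synthetic test frame are all assumptions; verify the nominal values against your signal generator's documentation.

```python
import numpy as np

# Nominal 8-bit Y' values for SMPTE 75% bars, left to right
# (Rec. 601; double-check against your generator's specs).
NOMINAL_Y = {"white": 180, "yellow": 162, "cyan": 131, "green": 112,
             "magenta": 84, "red": 65, "blue": 35}

def check_bars(y_plane: np.ndarray, tolerance: int = 5) -> dict:
    """Average each of the seven bar regions of one captured frame's
    8-bit luma plane and report the deviation from nominal."""
    h, w = y_plane.shape
    bar_w = w // 7
    report = {}
    for i, (name, nominal) in enumerate(NOMINAL_Y.items()):
        # Sample the middle of each bar to avoid edge transitions.
        patch = y_plane[h // 4 : h // 2,
                        i * bar_w + bar_w // 4 : (i + 1) * bar_w - bar_w // 4]
        report[name] = round(float(patch.mean())) - nominal
    return report

# Synthetic "perfect" capture: each bar filled with its nominal value.
frame = np.zeros((480, 700), dtype=np.uint8)
for i, v in enumerate(NOMINAL_Y.values()):
    frame[:, i * 100 : (i + 1) * 100] = v
print(check_bars(frame))  # all deviations 0 for a clean signal chain
```

Any bar whose deviation exceeds a few code values suggests the capture chain (not the display) is shifting levels, which is exactly what you want to rule out before touching the monitor.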
Quote:
The poor man's solution would be to use videoscopes to analyse the raw video. After editing with AviSynth, for example, add at the end of your script: Code:
function int2mode(int index) { return Select(index, "classic", "levels", "color") }  # maps an index to an AviSynth Histogram mode, e.g. Histogram(mode=int2mode(1)) shows "levels"
http://forum.doom9.org/showthread.php?t=164720
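For anyone curious what a videoscope-style check actually computes, here is a small Python/NumPy sketch (my own illustration, not from the doom9 thread): a per-column luma summary, like one line of a waveform monitor, plus a count of pixels outside the legal 16-235 luma range. The function names and synthetic frame are assumptions.

```python
import numpy as np

def column_waveform(y_plane: np.ndarray) -> list:
    """Return (min, mean, max) 8-bit luma per image column,
    a crude numeric stand-in for a waveform monitor trace."""
    return [(int(col.min()), int(round(float(col.mean()))), int(col.max()))
            for col in y_plane.T]

def out_of_range(y_plane: np.ndarray) -> float:
    """Fraction of pixels outside the legal 16-235 luma range."""
    return float(np.mean((y_plane < 16) | (y_plane > 235)))

# A frame whose top half is super-white (a clipped capture) is flagged:
frame = np.full((100, 10), 120, dtype=np.uint8)
frame[:50] = 250
print(out_of_range(frame))  # 0.5 -> half the pixels are illegal
```

The point is the same as the AviSynth approach above: the numbers come from the file itself, so they are trustworthy even when your display is not.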
I always found it better to work on a TV as a monitor, since you see how the video will end up being viewed anyway. Colors will not be as precise as on a 4:4:4 monitor, but the video is shown as it would be seen on a DVD or distributed copy.
Quote:
A source file that is spot on (0 dB) and a viewing system that is spot on (0 dB) is ideal. A source that is -3 dB (low) and a viewer that is +3 dB (high) nets out to about even due to the offsetting errors, though shadows and/or highlights may suffer. But if both are +3 dB (or both -3 dB), we may have significant problems.
Site design, images and content © 2002-2024 The Digital FAQ, www.digitalFAQ.com
Forum Software by vBulletin · Copyright © 2024 Jelsoft Enterprises Ltd.