Calibrating a monitor

The only way to accurately calibrate a monitor is with a hardware calibration device such as a Datacolor Spyder or an X-Rite unit. Just using your eyes to calibrate a monitor will never work.

So, is this what I need: the X-Rite ColorMunki Display Accurate Monitor Calibration (CMUNDIS)?

I would like to keep the purchase closer to the $100 range than the $300 range, especially since at this point my work is just a (non-paying) hobby.

That should work fine. If you want something cheaper, you might look at the Datacolor Spyder 4. I have the Spyder 3 and it works quite well, though the X-Rite i1 calibration device that came with my NEC monitor works better.
 
Once you get your calibration hardware, download dispcalGUI. It's free, open source, and better than the software that comes with a lot of calibration units.
Interesting. Quantify please. How is it better?

Try it out and see for yourself. I'm not the only one who thinks that either.

Basically everything is customizable. How many samples you want it to take, how many iterations you want it to run, etc... It takes longer, but you get a more accurate profile.
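To illustrate the statistics behind that claim, here is a toy simulation in Python (this is not dispcalGUI's actual measurement code; the patch value and sensor noise level are invented for the example) showing why averaging more readings of the same patch gives a more accurate number:

```python
# Illustrative only: a made-up patch value and noise level, simulating why
# taking and averaging more readings of the same patch reduces the error.
import random
import statistics

TRUE_LUMINANCE = 0.5     # hypothetical "true" value of a grey patch
NOISE_SIGMA = 0.05       # hypothetical per-reading sensor noise

def read_patch():
    """Simulate one noisy colorimeter reading of the patch."""
    return random.gauss(TRUE_LUMINANCE, NOISE_SIGMA)

def averaged_reading(n_samples):
    """Average n_samples readings, as a tool might when asked for more samples."""
    return statistics.mean(read_patch() for _ in range(n_samples))

random.seed(0)
for n in (1, 4, 16, 64):
    # Repeat the experiment many times to estimate the typical error.
    errors = [abs(averaged_reading(n) - TRUE_LUMINANCE) for _ in range(2000)]
    print(f"{n:3d} samples per patch -> typical error {statistics.mean(errors):.4f}")
# The error shrinks roughly as 1/sqrt(n): more samples take longer,
# but each measured patch value lands closer to the truth.
```

The effect matters most for dark patches, where the sensor's signal-to-noise ratio is worst.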
 

Try it out and see for yourself.
I don't care to put in the time and effort to DL it, install it, play around with it, learn about it, and all that it will take to actually find out for myself. You've already done so and presumably know all about it at this point, enough to promote it as "better", so just tell me how you know it's actually "better" with some data on HOW it's actually "better".

I'm not the only one who thinks that either.
That's great but, just as an example, the sheer number of Muslims in the world doesn't convince me that I should join their religion just because they all say it's "better". I'm the sort of person that needs actual facts and data to base my decisions on, not just popularity and the gut feelings of others.

Basically everything is customizable. How many samples you want it to take, how many iterations you want it to run, etc... It takes longer, but you get a more accurate profile.
How do you KNOW I'll get a more accurate profile? That's what I want to know. Where's the actual data that supports that statement? Where's the comparison? Show me the science behind it. Most important of all, is it a difference that makes ANY difference to human eyeballs?

Thanks in advance for your help with this.
 
You're arguing semantics, Buckster, but let's take a look at some of the options that come with popular colour calibration tools, shall we?

Case one: X-Rite Eye-One Match. I'm given the choice of:
A small set of selectable white balance targets
An input wheel to select a gamma curve from 1.0 to 3.0

And that's it. That was short. Right off the bat I don't like it already. What about an L* curve or the sRGB curve? When the software is finished it gives me a before-and-after shot, but where's my graph of DeltaE values across the brightness range? There's no support for my monitor's internal LUT, no easy way to measure results at different locations on the screen to judge backlight bleeding; hell, I don't even get a choice to limit maximum brightness or contrast ratio to meet the requirements of AdobeRGB viewing specs. Even the colour balance is almost fixed. There's a difference between D55 and 5500 K, and the software doesn't even give you that option, let alone custom entry of a specific kelvin value.
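For anyone wondering why the missing sRGB option matters, here is a quick numeric check using only the published sRGB transfer function (the sample code values below are arbitrary):

```python
# Illustrative only: compare the standard sRGB curve with a pure 2.2 gamma
# at a few 8-bit code values to show they are not the same target.
def srgb_to_linear(v):
    """sRGB decoding (IEC 61966-2-1): a linear segment near black,
    then an offset power function with exponent 2.4."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    """A pure power-law 2.2 gamma curve."""
    return v ** 2.2

for code in (5, 10, 20, 64, 128, 200, 255):
    v = code / 255
    print(f"code {code:3d}: sRGB -> {srgb_to_linear(v):.5f}   "
          f"gamma 2.2 -> {gamma22_to_linear(v):.5f}")
# Near black the sRGB curve's linear segment produces several times more
# light than a pure 2.2 power law, so which curve the software targets
# is visible exactly where shadow detail lives.
```

The two targets produce noticeably different amounts of light in the shadows, which is where the choice shows up on screen.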

Heck, the software that came with my monitor blows away the software that comes with most calibration units: white point calibration by x and y co-ordinate, calibration to DICOM curves, and I can even make my own arbitrary gamma curve and load it into the display's lookup table.
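As a rough sketch of what "make my own curve and load it into the display lookup table" amounts to (the 256-entry table size, the 16-bit output depth, and the two target curves are assumptions for illustration, not any particular monitor's format):

```python
# A sketch, not any vendor's actual format: turn an arbitrary tone curve
# into a simple 1-D lookup table of the kind a video card or monitor holds.
LUT_SIZE = 256        # assumed number of table entries
OUT_MAX = 65535       # assumed 16-bit output precision

def lstar_to_luminance(signal):
    """Target relative luminance for an L* calibration curve: treat the
    input signal as L*/100 and invert the standard CIELAB lightness formula."""
    lstar = 100.0 * signal
    if lstar <= 8.0:
        return lstar / 903.3              # 903.3 = 24389/27, the CIE constant
    return ((lstar + 16.0) / 116.0) ** 3

def gamma_to_luminance(signal, gamma=2.4):
    """Target relative luminance for a plain power-law gamma curve."""
    return signal ** gamma

def build_lut(curve):
    """Quantise a 0..1 -> 0..1 curve into integer LUT entries."""
    return [round(curve(i / (LUT_SIZE - 1)) * OUT_MAX) for i in range(LUT_SIZE)]

lstar_lut = build_lut(lstar_to_luminance)
gamma_lut = build_lut(gamma_to_luminance)
print("first entries, L* curve:  ", lstar_lut[:6])
print("first entries, gamma 2.4: ", gamma_lut[:6])
```

Once a target curve can be written out as a table like this, the calibration isn't restricted to whatever handful of presets the bundled software happens to offer.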

And it looks like this software (dispcalGUI) is a step more customisable still.

...

I'm happy that your solution works for you. The reality is that the results are likely to be the same when everything is left at the defaults, unless your tool does something silly, like X-Rite Match's failure to average low-light measurements, in which case other software is likely to give you better results. But the software that comes with most calibrators works only when the default settings are what will work for you. That is the sole reason other tools even exist and have a viable market share.

As for your comment about not wanting to put effort into trying something new that may (potentially) be better: I'm disappointed and saddened. Mind you, I also hate the words "good enough".
 
It's really simple: I just want to know if the resulting differences are detectable by human eyeballs or not.

I don't care if it samples 10 times or a million times, if the difference between them is so slight that it takes a super-computer with a $100,000 array of sensors to "see" the difference.

If it's not a difference that human eyeballs can even see, why should anyone give a snit that it's supposedly "more accurate"? How will that make any real-world difference?

If it IS a difference that human eyeballs can see though, that should be something that can be shown, measured, quantified, explained without too much fuss. So just do it, instead of giving me lip about why I'm questioning it.

Just show me. Give me a real reason to DL it, install it, get to know it and love it and hug it in my dreams.

Is that really such an outrageous request?
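For what it's worth, that question does have a conventional yardstick: express the measured colours in CIELAB and report the Delta E between them. The sketch below uses invented patch values purely to show the arithmetic and the usual visibility rule of thumb:

```python
# Illustrative only: the two Lab patch values are made up. The point is that
# Delta E (CIE 1976) puts a number on "can human eyeballs see the difference":
# around 1 is roughly the threshold of a just-noticeable difference, and
# 2-3 is generally visible in a side-by-side comparison.
import math

def delta_e_76(lab1, lab2):
    """CIE 1976 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical measurements of the same grey patch under two different profiles.
patch_profile_a = (50.0, 0.3, -0.8)   # L*, a*, b*
patch_profile_b = (50.6, -0.2, 0.4)

dE = delta_e_76(patch_profile_a, patch_profile_b)
print(f"Delta E (1976) between the two renderings: {dE:.2f}")
```

So the measurable version of the request is: profile the screen both ways, measure the same set of patches, and report the Delta E figures against that scale.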
 
