Screen Calibration Question

And what I'm getting from people in this thread so far is that nobody has ever heard of such a thing as monitor makers attempting to make monitors to any particular standard, certainly not to the printer's photo standards. So almost certainly, then, the reality is something like this:
[attachment: calibration3.jpg]
(The blue is probably even closer to a single line with almost no variation)

And there is no one solution. If I want the maximum number of people to see my work as I see it, I have to have one monitor at each calibration, or software that flips my monitor back and forth. And then edit the photo based on where I intend to publish it, in paper or online, using whichever calibration matches.

Assuming I give a crap enough to bother. Which I DO, because it sounds like a fun project in and of itself =D
 
I think pretty much all you can do is use sRGB throughout the whole process and know your equipment through constant use. Have two PCs, one for web and one for print, and optimize each for its purpose. The web system is calibrated, and surfing the premier commercial websites confirms it's set up right. The print system is calibrated for that job, and your prints look, in every possible way, "very close" to the monitor image. Or have two monitors? Each is calibrated for its task, and each is subjectively compared to either a range of 'good' websites or your prints.
 
Your monitor can't be on both peaks at once, but it can be on both curves twice.

I don't make monitors. I don't know what the hell they do at the monitor factory.

One would think that it would be in their best interest to standardize. That they have chosen not to tells me that they know something we don't.

:lol:
 
You are overthinking it again. Yes, there is a standard, but what you are trying to do is impossible. The brightness, hue, contrast, gamma, and individual color settings all play a part in the perceived color.

If you want an average, find the most popular monitor and keep the default settings.

Fun fact: one of my sites gets millions of visitors a year, and the most popular resolution by a mile is 1024x768. Most visitors still use 24-bit color, and shockingly, quite a few use 16-bit and even 8-bit.
 
The web system is calibrated, and surfing the premier commercial websites confirms it's set up right.
The premier websites thing is a good idea, but is only crudely useful. All it tells me is that my monitor is not WILDLY inappropriately different from everybody else's. As in, I'm not seeing orange where you see yellow or whatever.
It doesn't tell me anything about whether that particular shade of blue is as the artist of the graphic in the New York Times article intended, or not, etc.
I'm interested (even if just for fun) in trying to see if I can get down to the specific shade as closely as is possible on average by mapping a bell curve out around town.

One would think that it would be in their best interest to standardize. That they have chosen not to tells me that they know something we don't.
There are lots of reasons why they might not, which wouldn't really matter to me for this project (since I can't control any of the reasons). For example, Mac OS uses a lot of shades of gray by default, so Apple might set lower contrast on their monitors than Dell does (who mostly makes PCs), so that you can see prettier, smoother gradients of gray in all their gunmetal menus and stuff.

If you want an average find the most popular monitor and keep the default settings.
http://www.purplemath.com/modules/meanmode.htm
Mean =/= Mode.
Both are measurable. I want mean.
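To make the distinction concrete, here's a minimal Python sketch with completely made-up white-point readings; the only point is that the most common value (mode) and the average (mean) need not agree:

```python
# Minimal sketch with made-up numbers: "use the most popular monitor's
# defaults" answers a mode question, not a mean question.
from statistics import mean, mode

# Hypothetical white-point color temperatures (K) measured on ten monitors.
readings = [6500, 6500, 6500, 7200, 7200, 9300, 9300, 9300, 9300, 5800]

print(mode(readings))  # 9300 -- the single most common setting
print(mean(readings))  # 7690 -- the average across all monitors
```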

Although sales figures for the few most popular monitors would certainly be a useful step along the way (then I would only have to measure one of each model that I know the owner hasn't touched since factory settings, and do a weighted average, roughly as sketched below)
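Something like this Python sketch is what I mean by a weighted average. The model names, market shares, and measured offsets are placeholder values, not real survey data:

```python
# A sketch of "measure one factory-fresh unit per popular model, then
# weight by sales share". All names, shares, and offsets are hypothetical.

# Per-model measured offset from the sRGB target, as (dR, dG, dB) in
# 8-bit steps, plus that model's estimated share of monitors in use.
survey = [
    # (model,           share, (dR, dG, dB))
    ("GenericDell2209",  0.30, (+4, +1, -3)),
    ("GenericAcerX223",  0.25, (+7, +2, -5)),
    ("GenericAppleLED",  0.20, (-2,  0, +1)),
    ("GenericHP2009m",   0.25, (+5, +1, -2)),
]

total_share = sum(share for _, share, _ in survey)

# Sales-weighted mean offset per channel: the "average monitor" expressed
# as a single RGB shift away from the sRGB standard.
weighted = [
    sum(share * delta[ch] for _, share, delta in survey) / total_share
    for ch in range(3)
]

print("Weighted mean offset (R, G, B):", [round(x, 1) for x in weighted])
```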
 
I'm interested (even if just for fun) in trying to see if I can get down to the specific shade as closely as is possible on average by mapping a bell curve out around town.
You still have to put a sensor in front of the monitor to do that.

So buy yourself a Huey (or whatever), download the software, then do your survey.

Of course, you will have to install the software on every computer you measure - and the measurements, depending on settings used, can take a while to complete. Anywhere from 5 (minimal "quality" settings) to 60 (maximum) minutes.



There is no way to know without taking the measurements yourself, and you can't do that with software.
 
Is the graphic not clear? The reason I'm concerned about doing it is that there is a chance that calibrating for print might push me further OUT of calibration with internet viewing.

Your graphic is just guesswork because you don't actually know where on the graph your monitor is. If you calibrate you know your monitor is set to a standard, and thus should be at the top of the hump of your graph.

There is no such thing as calibration for internet viewing.

By calibrating your display you will know that your image file looks like you want it to look. You cannot control how everyone else sees your image so give up on that. But you can control how you see your image and how others with calibrated screens see your image.
 
Of course, you will have to install the software on every computer you measure
I was planning on just carrying a VGA adapter around with my laptop and plugging in monitors to it temporarily.
I would have to install for other people's laptops, but whatever.

Your graphic is just guesswork because you don't actually know where on the graph your monitor is. If you calibrate you know your monitor is set to a standard, and thus should be at the top of the hump of your graph.
The BLUE hump, yes. Not the red one, which is actually more important to me since I put more stuff online than in print.

There is no such thing as calibration for internet viewing.
Sure there is. There is a mathematical calibration average out there in the universe. Whether anybody bothers to measure it is a different question, but it does EXIST to be measured. Which if I have to, I will do myself (roughly, but better than knowing nothing)


You cannot control how everyone else sees your image so give up on that.
Sigh.
Why are so many people drawing an arbitrary distinction here between calibrating to print vs. internet? It is the exact same logic.
Guess what? I can't control what machines printers use either! Does that mean I should "give up on" calibrating to printers? No... of course not. What I do is control MY machine to match THEM.
Same thing for the internet. I don't change the world's monitors. I change MINE to match the average of THEIRS.
 
Are you up for a fool's errand, Gav???? THAT was exactly the point of my little Cartoon Gavjenks throwing his hands up in frustration. Other threads, both here and on dPreview, have brought up the fact that there is a positively HUGE range of monitors and operating systems in use, as well as different types and/or kinds of displays...there are super monitors, excellent monitors, average monitors, and then there are those like the "Mom's basement computer monitor", as well as dinosaurs like the ancient icons-burned-in NEC 15-incher like my buddy Scott STILL HAS on an ancient Win-Doze 98-equipped Dell system that he keeps his lawn and landscape business books on...OMFG, that thing is an utter joke...his kids tease him about it...I razz him about it... "Just one more season, that's all it needs to make it through," he told me last month.

Even remotely hoping to come close to what "the Internet peeps see" is pushing it. There are people with their brightness levels set to MAX, as well as people whose monitors are so dark that everything looks like film noir.

In addition to that, you would also need to take into account the lighting conditions of the room each monitor is in. The ambient lighting has an effect on how we see things. I use a Spyder to calibrate, and it constantly takes ambient light readings as long as it is left plugged into the computer.
 
I don't NEED to account for lighting, or people's psychology or eyeballs.
Yes, these things matter and they affect perception, but I can still get closer to accurate by addressing monitors only.
It's not an "all or nothing" situation.

Keep in mind that lighting conditions also matter for prints, yet you don't see anybody saying "Oh well, we can't measure room lighting for my client hanging this print, so I may as well give up and not even try to calibrate for printing at all."


Yeah, if you are Frank Lloyd Wright and you are micromanaging every little bit of furniture and decoration in a house, then go ahead and calibrate the photos on the wall with the room's lighting. Otherwise, just calibrate for the print itself, and you won't be perfect, but you'll be closer to perfect than if you did nothing. Same goes for monitors.
 
Sure there is. There is a mathematical calibration average out there in the universe. Whether anybody bothers to measure it is a different question, but it does EXIST to be measured. Which if I have to, I will do myself (roughly, but better than knowing nothing)

And because it has not been measured, you don't know what it is, and therefore, practically speaking, there is no such thing.

Why are so many people drawing an arbitrary distinction here between calibrating to print vs. internet? It is the exact same logic.
Guess what? I can't control what printers machines use either! Does that mean I should "give up on" calibrating to printers? No...

I know I'm not referring to calibrating to print, and I don't think most others are. I'm simply referring to calibrating your display so that what you see is an accurate representation of the file.
 
I was planning on just carrying a VGA adapter around with my laptop and plugging in monitors to it temporarily.
That should work. Surprised I didn't think of it, lol.


There is no such thing as calibration for internet viewing.
Sure there is. There is a mathematical calibration average out there in the universe. Whether anybody bothers to measure it is a different question, but it does EXIST to be measured. Which if I have to, I will do myself (roughly, but better than knowing nothing)
When you calibrate, you're not calibrating to "the internet" or "the printer" - you're calibrating to the sRGB standard (or whatever other color space you choose). Average does not equal calibration.
 
people whose monitors are so dark that everything looks like film noir.

..hehehe good one. I searched for a fair-use Blofeld (Charles Gray) picture to post, but nothing doing. That's my monitor at 05:00 AM btw.
 
I'm simply referring to calibrating your display so that what you see is an accurate representation of the file.
sRGB is pretty useless to me, because regardless of how they are labeled, monitors aren't usually accurate to sRGB. I care about showing files as they are actually most commonly seen, not how they are "supposed to be" seen but actually aren't. In other words, if 90% of people view so-called "sRGB" files under incorrect conditions for what sRGB is supposed to be, in the direction of being too red, then I want my monitor to also be too red compared to what sRGB files should be.

So calibrating to sRGB is not going to accomplish my goals. It only calibrates me to a world where everybody actually correctly uses sRGB. Which, on Earth, happens to be otherwise known as "the printing industry" (and associates like some photographers and maybe other industries, I dunno).

When you calibrate, you're not calibrating to "the internet" or "the printer" - you're calibrating to the sRGB standard (or whatever other color space you choose). Average does not equal calibration.
If your defined absolute colorspace happens to overlap with the average, then yes, it is the same thing. And since I am defining my desired colorspace AS the average of internet users' actual monitors and viewing conditions (the tool should take room lighting into account), then when I calibrate to my new colorspace, I will also be bringing myself to the average. They will be the same thing.

Call the new colorspace "aRGB" if you like (averageRGB). It is identical to sRGB, but shifted linearly some amount based on the difference between sRGB and the average of what actual internet users have.

aRGB is not a distribution. It is one specific absolute colorspace and viewing condition, just like sRGB. It just happens to lie precisely at the average of actual internet users' conditions.
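As a rough sketch of how that shift could be applied in practice: the offset values below are hypothetical stand-ins for whatever the survey would produce, and a real implementation would work in a proper color-managed pipeline rather than raw 8-bit arithmetic.

```python
# Apply a surveyed average offset to an sRGB image as a simple per-channel
# linear shift, to preview roughly how the "average" monitor would show it.
import numpy as np
from PIL import Image

# Hypothetical surveyed average offset from sRGB, in 8-bit steps per channel.
AVG_OFFSET = np.array([+4.0, +1.0, -3.0])

def preview_as_average_monitor(path_in: str, path_out: str) -> None:
    """Shift an sRGB image by the surveyed average offset and save a preview."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
    shifted = np.clip(img + AVG_OFFSET, 0, 255).astype(np.uint8)
    Image.fromarray(shifted).save(path_out)

# Usage (file names are placeholders):
# preview_as_average_monitor("photo_srgb.jpg", "photo_average_preview.jpg")
```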
 