I think that proves that a good calibration makes what you see in Photoshop and the web match, as it should. Flash is not color managed today, so even if LR is attaching a profile, it's not being used by Safari.
It can but there's no guarantee.
Now things get interesting when I view my Flash galleries on a wide gamut display (93% of Adobe RGB (1998)). Firefox and Safari match, which probably indicates that the upload from LR doesn't contain a profile. They don't match LR: there's a difference in saturation, but it's not a lot (which I wouldn't have expected). In fact, at least with some of the images I compared, I don't personally mind what I'm seeing on the web pages. A bit more Velvia look to them but nothing I mind. Hue seems to be well maintained; it's saturation that's altered. I'm viewing the shot of the man washing his green boat. Adobe RGB (1998) is larger in green primary gamut than sRGB (that's its biggest difference). Looks fine to me. This display is calibrated far from what we'd be calling "sRGB" both in terms of its gamut, luminance, etc.
The 23" Cinema display, calibrated and profiled to native gamma and white point looks identical in the three applications. This may be a good point to calibrate such displays to (native/native) where no LUT is being affected.
Unfortunately, not all calibration packages provide this.
We have an (original) LAB image, sRGB and Adobe RGB (1998) image with lots of colors.
I see a difference between Firefox and Safari (again, it's saturation). Safari matches LR previews on a standard sRGB calibrated and profiled display and on a wide gamut display. Firefox doesn't on either.
Note on viewing. Ideally when comparing images, you should do so at 100% (1:1), especially when dealing with Photoshop. Not really possible here due to the size of the web images. But at the very least, attempt to size the comparisons to the same size on screen. Or download the full rez images and resize to match the web.
I ran some color calibration tests...
I did a bunch of measurements today to see how my monitor behaves in Firefox, Safari and Photoshop with a bunch of different monitor profiles. These are the things I was hoping to gather some info on:
How do Safari, Firefox and Photoshop compare with different calibration profiles?
How does the color temperature setting in the calibration process affect the colors that are displayed in each app?
Can I learn anything that might help mitigate the difference between Firefox and Photoshop?
Which color temperature setting is the best for my monitor if my objective is to see accurate color (not a representation of printed output)?
Summary of procedure
Get a color chart (one with somewhat standard blocks of color on it)
Make sure it's in sRGB and tagged that way
Resize the image so three copies of it fit at 100% on my screen
Do a particular color calibration of my screen (details below for which ones I did)
Load the image into each app
Position all three apps so I can see all three at the same time
Use my camera as a poor man's color measuring tool by taking a picture of the screen
Load that picture into Photoshop and use the eye dropper to read the color values off 6 different color swatches in the image
Record all those values in both sRGB and LAB (using the info palette in Photoshop; a small conversion sketch follows this list)
Calibrate the screen for the next color temperature
Reload all the apps
Take picture, repeat process for each color calibration
Put all results into a spreadsheet
Decide what conclusions I can draw from the data
Write up results
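For the curious, here's a minimal sketch (Python, with made-up swatch values) of the sRGB-to-LAB conversion the info palette does for you, plus a simple delta-E distance to summarize how far apart two apps' readings of the same swatch are. It's just the standard published math, not anything from the calibration software:

```python
import numpy as np

def srgb_to_lab(rgb8):
    """Convert an 8-bit sRGB triplet to CIE LAB (D65 white)."""
    rgb = np.asarray(rgb8, dtype=np.float64) / 255.0
    # sRGB decode to linear light
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    m = np.array([[0.4124, 0.3576, 0.1805],   # standard sRGB -> XYZ matrix
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = (m @ lin) / np.array([0.95047, 1.0, 1.08883])   # normalize to D65 white
    f = np.where(xyz > (6 / 29) ** 3, np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    return np.array([116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

def delta_e(lab1, lab2):
    """Plain CIE76 color difference."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

# Hypothetical eyedropper readings of the same swatch in two apps:
firefox = srgb_to_lab((205, 92, 92))
photoshop = srgb_to_lab((188, 98, 96))
print(delta_e(firefox, photoshop))   # one number summarizing the mismatch
```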
Conclusions
Here are some of the conclusions I can confirm:
The detailed color measurements are in this table if you want to see them yourself. I personally find it easier to look at the LAB numbers.
The color temperature setting does indeed affect how a calibrated, profiled screen looks in Photoshop so this color temperature number is very important if you want to see accurate colors
Safari and Photoshop display very similar colors when viewing this ICC profile tagged image (easily within the margin of error of my measurements)
Firefox varies significantly from Safari and Photoshop (no surprise here)
Firefox displays quite differently after calibrating/profiling than before and how it displays depends a lot on the color temperature number you pick for the calibration process. Since Firefox is not using my monitor profile for monitor compensation (like color-managed apps use), that means the Eye One Display 2 must be changing the default monitor display (probably with LUT values) according to the color temperature setting.
I saw some small luminosity fluctuations between swatches that were otherwise nearly the same color. I don't know exactly why that is, but it could be explained either by slight fluctuations in the backlighting on my monitor or by some light fall-off with my lens.
With this setup, I am unable to assess absolute color accuracy (because of unknown white balance settings and uncalibrated RAW converter color interpretation), but I can measure differences between the three apps. The measurements are consistent, so they can be used to look at color differences between scenarios, just not absolute color accuracy.
I see the smallest difference between the Firefox colors and the Photoshop colors in either the "native" calibration or the 6000 degree color temperature calibration, though there are still significant differences. The whites/grays are pretty close in Firefox at this temperature, but most of the other colors are significantly different.
There's no good setting that can make skin colors in Firefox get close to Photoshop. I picked the pink and peach color swatches because they were in the same neighborhood of skin color, but they are always different - consistently showing similar luminosity, but higher numbers in both A and B channels.
Looking at the 6000 degree calibration as the one I will probably use, we see that in all swatches measured, Firefox shows A and B values further from zero (more positive and more negative). It's as if the monitor has "cranked up the color" and the calibrated view through Photoshop has brought it back to reality (see the chroma sketch after these conclusions). This might explain why some people like the Firefox view better than the Photoshop view. It appears to have richer colors.
"Native" for my monitor is almost identical to the 6000 degree calibration. I assume this is monitor dependent.
Details of the experiment
Monitor: HP LP3065 (30" LCD).
Windows OS, Vista 32-bit Home Premium
GretagMacbeth Eye One Display 2 USB calibrator puck and software
NVIDIA GeForce 8600 GTS video card
In more detail, here's what I did. I started with an image that has a color chart in it. Andrew has one linked from his site on this page, so I started with that one. It was in the ColorMatch color space, so I converted it to sRGB and then downsized it so three simultaneous copies would fit on my screen at 100%. Here's what a small version of the image looks like:
While there are lots of colors on this that I can look at to subjectively evaluate, I decided to focus only on the solid colors in the color patch below the photo of the lady as those are easiest to consistently measure with the eye dropper and info palette in Photoshop.
Then, I set a particular color calibration on my screen. For the "uncalibrated view," I used the HP monitor profile that comes with my monitor, which is presumably some factory-supplied standard monitor profile. For the other color temperatures, I did a full screen calibration using the Eye One Display 2 system that I have. I tried to place the puck in the same part of the screen for each calibration. I made the room as dark as possible and even shielded the screen from any stray light through the blinds. I tested all of the following color temperature settings in my color calibration software:
Native
5000 (fairly warm)
5500
6000
6500 (cooler)
Then, after setting a particular color calibration, I loaded the test image into each of the three apps. I positioned the apps on the screen so that I could see all three at once.
With my Nikon D2xs and 105mm macro lens on a tripod, set for manual exposure (so the exposure would be exactly the same for all photos), set for fixed sunlight white balance (so we'd have a consistent WB setting for all photos), I took a picture of the screen in RAW.
I noticed that if I focused on the screen to make the image sharp, I would get a lot of moire. This is likely because the pitch of the pixels on the screen is somewhat close to the pixel pitch on my sensor at the distance I was at. I figured the moire would be a really bad thing for consistent color measurement so I decided to just defocus the image slightly. This makes the moire go away (because it blurs the screen pixels to a larger size so they no longer match the sensor pixel pitch) and, as long as the blur isn't too bad, it shouldn't really affect a color swatch measurement in the center of the swatch. I thought about doing the blur in post processing, but decided it was better to just never have moire in the image in the first place.
Here's what one of those pictures looks like (purposely slightly out of focus). Firefox is on the left, Safari in the middle and Photoshop on the right:
I repeated this process for each of the different color calibrations, so that I ended up with 7 test images, each with a different monitor profile. The resulting test images after converting to JPEG are all here.
I then loaded the RAW file into ACR, turned off all auto adjustments so all images got the exact same settings in ACR and opened them one at a time in Photoshop. In Photoshop, I took color readings using the eye dropper from the image in Photoshop off six different color swatches for each app and I recorded the numbers in both RGB and in LAB (just using the info palette) in a big table. That's 18 color measurements, recorded in both RGB and LAB, for each image. I realized that the LAB values would be useful for seeing differences in luminosity separately from color.
By looking at the numbers in the table, I can see that the 6000 degree numbers are very close to the native numbers. I think that means that my monitor is natively around 6000 degrees.
In this test, it is possible to get the neutrals in Firefox and Photoshop to line up pretty well (~6000 degrees), but even when they line up, the other colors are off. I think this proves that the monitor does indeed need calibration, and the calibration it needs is not as simple as a uniform color shift.
There are so many numbers in this table that I was trying to figure out if there was some graphical way to represent it, to perhaps see some sort of trend that's hard to see in the numbers, but I didn't come up with any immediate ideas for what to show. I also made the mistake of entering each color triplet into a single cell in the spreadsheet, so it would take some busy work to change the numbers so they could be graphed. I did wonder about trying to quantify the color differences between Firefox and Photoshop with some sort of formula based on the differences between the color patch measurements, but I didn't try it yet.
In every single case I looked at, the Firefox numbers were "more saturated" than the Photoshop numbers. I'm guessing that's because more saturated monitors sell more so they crank it up in the design of the monitor. The monitor compensation that Photoshop does actually serves to "dull" the colors a bit. I believe it's probably doing the accurate thing, but I can now see why some people say they prefer the Firefox version of the image over the Photoshop version. How much this happens is presumably monitor-dependent, but I wouldn't be surprised if it's a common industry norm.
The LAB numbers are a lot easier for me to draw conclusions from because you can easily see luminosity separate from the A and B color channels. If you aren't familiar with the LAB color space, then it's probably worth a quick read of this Wikipedia article and there are many other articles on the web if you search on Google.
Comments? Questions? Any other conclusions you can draw from the numbers?
Yes, the color temp of a calibration will absolutely affect the previews (that is being applied in the LUT). The profile uses this too.
6000K may be ideal for your display. Don't know if that's true for every display. The luminance will likely alter this sweet spot too. And I'd refrain from using kelvin for calibration and instead focus on the standard illuminants, as at least then you're talking about a single color.
Unless I missed something, I'm not sure how you got the Lab values (did you measure with the colorimeter?).
The native white point of various displays might be all over the map.
I am going to spend some quality time with this over the weekend (along with the other links you referenced), but for the moment I've been swept into the world of video formats.
John, you realize we have a beeg problem to face in the next few days, and it's your fault (well, you and Andrew)? We're converting our built-in slideshow from Javascript-based to Flash by popular demand. So what's gonna happen is you'll be browsing your galleries in Safari and because we now attach profiles to all images (unlike, for example, the International Color Consortium, Apple and Adobe) you'll see your photos in all their glory (unlike, for example, with Andrew's Flash-based galleries).
And then you'll click the slideshow button and presto, the color will shift because now your photos are displayed in Flash, which doesn't know from color management.
And our customers are not gonna blame Adobe, Apple, John and Andrew... The shrill emails coming to our support heroes are gonna say, "SmugMug wrecks my photos!" :cry :cry :cry
I was hoping to uncover more useful conclusions in the testing, but at least I've verified a bunch of things that were talked about and I think I understand what's going on much better now. The two big misunderstandings that I've run into in the last 6 months are:
1) Calibrating your screen doesn't do much calibrating at all. It mostly profiles and it's up to a color-managed app to use the profile to make accurate colors. So, calibrating doesn't really help Firefox/IE at all. Ask 20 people who bought a color calibrator in the last year and I bet 19.99 of the 20 think that their screen will display more accurate color even in Firefox and IE after calibrating it.
2) These calibration/profiling tools have a design center around matching a printed page and that's why they make you pick a color temperature. They do not have a design center around giving you accurate emissive color so your screen will look the same as mine when we both use color-managed tools. If we happen to coordinate and pick the same color temperature, they should come close, but if we don't or don't even know each other, they won't match. I hope this industry offers tools for "web viewing" that are centered around color accuracy for web viewing rather than color matching to a print in your specific lighting. That's probably the larger market going forward as the market for color calibration goes mainstream.
On all the color issues, the big lesson to me here is that for the majority of the public (which is not very educated about color stuff), color differences between the same image displayed two different ways are way, way more noticeable than color accuracy issues. Thus a difference between Photoshop and your browser is way more visible than both being the same, but wrong. When you're striving for color accuracy and the tools your viewers are using to view the web are only halfway there, what a pain! I feel for the predicament. I guess you'll just have to take a few scars for heading this way now and the tools will catch up over time. The only other choice is to wait until the entire installed base of tools gets upgraded over time, and that would be a long ways out, so you've probably got to get in before everything is there anyway. I feel a big color FAQ coming.
Any idea what the ETA is for color-management in Flash? I've heard that it's coming sometime and I can certainly see how it's in Adobe's interest to do so, but I've not heard when.
Firefox displays quite differently after calibrating/profiling than before and how it displays depends a lot on the color temperature number you pick for the calibration process. Since Firefox is not using my monitor profile...
John, I read your references for calibrating & profiling and they made tremendous sense. I don't think anything surprised me but they created a lot of clarity.
I'm not sure I understand your twin statements about FF displaying quite differently after calibrating/profiling and it not using the monitor profile. Can you explain? I'm assuming you mean FF doesn't but the OS does. Else why would the net result be different?
Are we discounting Gary Ballard's views? I ask because the Aperture team and Apple's pages reference him on these issues, and his views and mine are very close (but presented with compelling data, I'll change).
His pages are long and multi-topic so I'll extract the key points to make this easier:
"Obviously there is nothing we can do about all the computers in the real world that use bad monitor profiles on uncalibrated systems, but good 2.2 gamma, 6500/D65 monitors will display sRGB with the most faithful appearance with the best, truest color possible."
"If untagged sRGB displays incorrect color in Safari or FireFox, I would say you have a bad monitor profile or bad operating system, software bug or odd approach to monitor profiling."
"If Photoshop working color is set to sRGB and the monitor profile was created for 2.2 gamma and 6500K, color shifts in Save For Web result from a bad monitor profile."
The catalyst for this thread was that Andy and Steve calibrated/profiled their monitors and after doing so untagged sRGB images looked worse (oversaturated). It's hard for me to understand how that sacrifice has to be made to make Photoshop accurate. Photoshop has nearly unlimited latitude to change the image.
If Eye-One's shoddy code or cheap spectrometer is creating a display profile that doesn't cause the viewer to see accurate color, what's to stop us from using the spectrophotometer I own, displaying a digital Gretag color chart of known RGB values, and writing our own profiles? Then we can check the thing we haven't checked yet: are we creating accurate calibrations/profiles?
I'm not sure I understand your twin statements about FF displaying quite differently after calibrating/profiling and it not using the monitor profile. Can you explain? I'm assuming you mean FF doesn't but the OS does. Else why would the net result be different?
Here we go again.
An ICC-aware app uses the profile to preview; a non-ICC-aware app doesn't. What you should be assuming is that FF doesn't use the profile and Safari does, not the OS.
The assumption is that if you calibrate a display, that's a fixed preview for all applications, but of course it's not. Calibration affects all applications, but ICC profile compensation only affects some. So it's perfectly normal that the two don't match.
Let me try to explain. When you calibrate your display, there are potentially two things going on (and I see both of these happening in the measurements of my system).
1) The default display colors are changed.
2) The resulting screen is then measured, and a monitor profile which describes its behavior is created and then put in a well-known system location.
There are a couple of ways that the first item is accomplished, depending upon your technology. The way I understand it, a "look-up table," often called a LUT, is created and given to the video driver. I don't understand the mechanics of this, but it happens at the OS/driver level. This will change the colors the monitor produces when fed a given RGB value. That means that non-color-managed apps like Firefox are directly affected by this. Unfortunately, these LUT changes aren't really fine calibration settings. They don't successfully "calibrate" your monitor for non-color-managed apps. I don't really know this for sure, but I think all they are trying to do is get your monitor in the right ballpark so the corrections specified in the monitor profile are more manageable and not too far off.
Then, once these LUT values are put into place, the resulting screen is measured for accuracy and those results are recorded into the monitor profile. You can think of the monitor profile as a "color error map," as it can be used to understand the existing color display errors in the default display. A color-managed app then grabs the monitor profile and uses it to figure out how it must change the numbers it sends to the screen in order to get the actual color it wants.
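A toy model may make the two mechanisms clearer. The gamma numbers and the "profile correction" below are made-up stand-ins, not what the Eye One software actually does:

```python
import numpy as np

# 1) Calibration: a lookup table is handed to the video card. EVERY
#    app's output passes through it, Firefox included.
ramp = np.linspace(0.0, 1.0, 256)
video_lut = np.round((ramp ** (2.2 / 2.4)) * 255).astype(np.uint8)

def panel_gets(code):
    """Value the panel is fed for a given app RGB code -- any app."""
    return int(video_lut[code])

# 2) Profiling: the post-LUT screen is measured and the "error map" is
#    stored in the profile. Only a color-managed app consults it and
#    pre-corrects its numbers before they hit the LUT.
def managed_panel_gets(code, profile_correct):
    return int(video_lut[profile_correct(code)])

toy_correction = lambda c: min(255, round(c * 1.04))  # stand-in for profile math
print(panel_gets(128), managed_panel_gets(128, toy_correction))
# Two different values reach the panel: the LUT already moved Firefox's
# color, and Photoshop moved it again, via the profile, toward accuracy.
```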
I'm not sure I understand your twin statements about FF displaying quite differently after calibrating/profiling and it not using the monitor profile. Can you explain? I'm assuming you mean FF doesn't but the OS does. Else why would the net result be different?
The LUT values set in item #1 above cause the screen to change what color it displays for a given RGB value. This means that non-color-managed apps like Firefox will have their colors affected with the LUT values. You can see in the actual Firefox color measurements I took that indeed, it displays different colors for each color calibration even though it's not using the monitor profile. Since Firefox is not color-managed, it isn't doing anything differently across these different calibrations/monitor profiles. It's the LUT values that are set differently and causing the display to show different colors.
"Obviously there is nothing we can do about all the computers in the real world that use bad monitor profiles on uncalibrated systems, but good 2.2 gamma, 6500/D65 monitors will display sRGB with the most faithful appearance with the best, truest color possible."
None of us are finding this true in practice. The monitors Andy calibrated didn't show it, and no calibration I did on mine (and I've tried a whole bunch of different ways) showed it either. Mine's not far off, but skin tones are noticeably different. For this to hold, one of two things would have to be true:
1) Either the native monitor color would have to be very close to sRGB.
or
2) The LUT calibration would have to be able to make the monitor produce very good sRGB such that when it's fed a given RGB value (anywhere in the color spectrum), it produces the right sRGB color on the screen.
The color numbers in my measurements show this not to be the case for my configuration. My monitor produces oversaturated colors when uncalibrated and at all calibration temperatures. I don't know if it's possible to use the LUT to make the default display match sRGB better or not, but my software is not doing it.
"If untagged sRGB displays incorrect color in Safari or FireFox, I would say you have a bad monitor profile or bad operating system, software bug or odd approach to monitor profiling."
That's what I thought would be the case when I first started calibrating my system, but practice doesn't show that to be true. As in the previous section, I'd like to know if it's possible for the LUT calibration changes to make the default display more like sRGB or not, but regardless my Eye One Display 2 software is not doing that. This is not theory here, just actual measurements.
"If Photoshop working color is set to sRGB and the monitor profile was created for 2.2 gamma and 6500K, color shifts in Save For Web result from a bad monitor profile."
Same argument as the previous two sections. This assumes that the default display has been "calibrated" to be close to sRGB, either because it came that way out of the box or with the LUT changes. None of ours are that way after calibration. I wish it were true, but in practice we aren't finding it to be true.
The catalyst for this thread was that Andy and Steve calibrated/profiled their monitors and after doing so untagged sRGB images looked worse (oversaturated). It's hard for me to understand how that sacrifice has to be made to make Photoshop accurate. Photoshop has nearly unlimited latitude to change the image.
The default display (what non-color-managed apps display) is clearly being changed when we calibrate our displays. You can see that from the color numbers in my measurements. Just look at the Firefox color values for the peach color. They are significantly different at each color temperature, and all different from the unprofiled setting. My screen is oversaturated by default and it stays that way at each calibration setting. I don't know whether I can conclude that the Firefox view looks "worse" after calibration or not. I know that it's different and no more accurate than it was before. It seems like it should be possible to have calibration software that did no LUT management at all, and thus didn't change your default display (leaving Firefox completely unaffected by the profiling process), yet still created a completely accurate monitor profile so color-managed apps could apply the right correction and display accurate color. This should be theoretically possible if the monitor isn't natively too far off. If we could get the right software engineer who knows how the Eye One Display 2 software works in a room for two hours, we could get those answers.
If Eye-One's shoddy code or cheap spectrometer is creating a display profile that doesn't cause the viewer to see accurate color, what's to stop us from using the spectrophotometer I own, displaying a digital Gretag color chart of known RGB values, and writing our own profiles? Then we can check the thing we haven't checked yet: are we creating accurate calibrations/profiles?
As I hope I explained above, the monitor profile doesn't change the screen for Firefox at all. It's the LUT values in the video driver that do that. With different software design goals, it might be possible to create calibration software that would set the LUT values to make the default display get as close to sRGB as possible so that non-color-managed apps had much more accurate color. I don't know for sure. I'll do some searching on LUT calibration and see if I can learn anything. One thing I would expect is that if you tried to make LUT calibration produce sRGB, you might be squashing some of the colors that the monitor can produce. Certainly if this was done for a monitor capable of producing most of AdobeRGB, you'd be ruining that with a LUT that squished it down to sRGB.
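To see why forcing sRGB would squash a wide-gamut monitor, here's a quick sketch with the standard published D65 primaries (rounded matrices; nothing vendor-specific):

```python
import numpy as np

# Pure Adobe RGB green simply has no sRGB equivalent, so any LUT
# forced toward sRGB must clip it.
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2973, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

adobe_green = np.array([0.0, 1.0, 0.0])    # linear-light Adobe RGB green
srgb_linear = np.linalg.inv(SRGB_TO_XYZ) @ (ADOBE_TO_XYZ @ adobe_green)
print(srgb_linear.round(3))                # ~[-0.398  1.    -0.043]
# The negative red and blue components mean this green lies outside
# sRGB entirely: force the display into sRGB and that color is gone.
```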
1. Calibration alone isn't enough. That's WHY we have the ICC architecture we do, where profiles are necessary for the Display Using Monitor Compensation process. We've seen this historically. The ONLY displays I've ever seen that can be calibrated to visually identical specs are true reference displays: Radius PressViews calibrated to ColorMatch RGB, Barco Reference V, etc. We're talking (in 1990 dollars) units that cost $3-5K.
2. No amount of calibration makes a unit exactly into an sRGB device. Let's recall that sRGB is a theoretical color space based on a specific device but not necessarily on a real device. It's probably possible to take an old CRT display with P22 phosphors, place it into the surround specified as sRGB (the ambient light conditions in terms of intensity and color), and make it mimic sRGB very closely. Your modern LCD isn't going to do this. It's been designed to get "close," and it takes a lot going on internally for this to happen. We must have a profile that defines the current device behavior and an ICC-aware application to produce the visual effect of sRGB on differing units.
Bottom line is, no matter how you calibrate that modern display, it might get close but not exactly produce sRGB. Just look at the rather huge differences in the light source of the two emissive technologies and I think you'll see how difficult this is. Also look at the huge differences in the maximum luminance of even a new CRT (maybe 100 cd/m2) and a new LCD (many can easily hit 300). The sRGB spec had no idea these devices would exist when it was designed (again, solely from simple math).
The only way to get multiple dissimilar displays to produce the same color appearance is to fingerprint their conditions using a profile and let the profile compensate on the fly in the application. Or expect every user to have a true reference display. But getting close outside ICC-aware applications isn't necessarily a bad idea, UNLESS you've got to screw around with the 8-bit LUTs to the degree that you move very far from the native condition; then the result is more banding on-screen. Newer units with high-bit internal adjustments overcome this. But then, ask yourself if forcing a non-sRGB-behaving display into an sRGB behavior is a good idea. Usually not. And if you're talking about newer units that exceed the sRGB gamut, forcing them into sRGB is even more pointless.
We need to move away from sRGB as some kind of useful, well defined standard. It's not 1993! And even more shocking is that of all the digital imaging technologies since then (cameras, scanners, processors, software), the one technology that hasn't moved anywhere near as quickly is display technology. That's changing (slowly). sRGB is a dinosaur.
A little more info on LUT values
OK, a little more info on the LUT. The LUT values can be stored in the monitor profile, but are acted upon by external software (either in the OS or something that runs at startup) and applied to the video card. One source I've read suggests that the LUT values are used by most software to get the monitor to a proper gamma and not used to calibrate the color.
I've found several other discussions that say that you get the best color reproduction in color-managed apps if the LUT value changes are minimized. This happens if you set your calibrator for native color temperature.
If you start manipulating color in the LUT too much, you start diminishing the colors that can be produced on the monitor, and this can also lead to banding (if you lose the ability to produce certain colors on your monitor). Part of this is because most LUTs are 8-bit, and multiple 8-bit manipulations on an 8-bit display start to cause rounding errors, which lead to errors in the lowest bits, which manifest as banding.
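Here's a toy sketch of that rounding effect: push all 256 levels through a couple of 8-bit LUT passes and count what survives (the gamma numbers are arbitrary, but the mechanism is the rounding described above):

```python
import numpy as np

levels = np.arange(256, dtype=np.float64)

def lut_pass(x, gamma):
    """One 8-bit LUT pass: apply a curve, then round back to integers."""
    return np.round((x / 255.0) ** gamma * 255.0)

once = lut_pass(levels, 1 / 1.1)
twice = lut_pass(lut_pass(levels, 1 / 1.1), 1.1)   # round trip through two LUTs

print(len(np.unique(once)), len(np.unique(twice)))  # both < 256
# Every lost level is a pair of adjacent tones that now display
# identically -- visible as banding in smooth gradients.
```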
I've also found a whole slew of articles that say that Windows Vista has all sorts of problems with losing the LUT values once they've been set. Here's one particular article. Apparently they can get lost after your screen times out and gets turned off (if you have it set that way) and whenever Vista dims your screen to ask for permission to do something. What a mess! Maybe this will get fixed in the Vista update due this quarter.
I pinged Gary Ballard this a.m. but dgrin's anti-spammer artillery kept him at bay so he asked me to post this:
Chris,
I don't know anything about the rocket science, but the short answer is the 'problem' monitor hardware displays a higher gamut than the sRGB space, hence the saturation boost when the monitor profile is applied/assigned/assumed to sRGB.
The issue is very easy to nail in Photoshop:
1) Convert to sRGB in Photoshop
2) go to View> Proof SetUp: Monitor RGB
That should show the saturation boost you are seeing exactly as untagged sRGB will display in Mac web browsers.
I think calling the monitor manufacturer and the hardware calibrator tech supports and asking them WHY this is happening is more time productive…this exact problem was beat to death in the Photoshop forum recently http://www.adobeforums.com/webx?14@@.3c053066/0
Do you understand what View > Proof Setup: Monitor RGB does? Because it doesn't solve any of the issues we've discussed.
This setting tells Photoshop to use its ICC-aware capabilities to view an sRGB document (in this case) as if Photoshop were a dumb web browser. It's doing an sRGB to Monitor RGB (YOUR specific monitor) conversion. The only thing this soft proof setting is useful for is showing you how the sRGB doc would look if you viewed it in a non-ICC-aware application on your machine. But you could do this by simply opening the sRGB document in IE on your machine. Photoshop just wants to provide a soft proof of this for you while you're in Photoshop.
No one but you will see this preview. Photoshop has no way to understand what any other user's display is doing (certainly without that display and a profile for it). So you're back to square one.
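If you want to simulate the same idea outside Photoshop, here's a rough sketch using Pillow's ImageCms (the file names are placeholders; point it at your own display profile). An untagged view sends the document's numbers straight to the screen, so reinterpreting those numbers through the monitor's profile and converting out to sRGB makes the shift visible in any color-managed viewer:

```python
from PIL import Image, ImageCms

img = Image.open("srgb_test_chart.jpg")              # numbers are sRGB codes
monitor = ImageCms.getOpenProfile("my_monitor.icc")  # your display's profile
srgb = ImageCms.createProfile("sRGB")

# Assign the monitor's meaning to the untouched numbers, then convert out:
shifted = ImageCms.profileToProfile(img, monitor, srgb, outputMode="RGB")
shifted.show()   # approximates the dumb-browser rendering of the document
```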
Two things could explain why this problem seems to have gotten more prominent lately. First, LCDs have taken over the world and most of them really can't be color calibrated much at all (they get profiled for color-managed apps). CRTs could get calibrated.
Second, some of the newest generation of monitors have a wider gamut than the older generations, which, Gary explains, causes more of a mismatch when the monitor is assumed to be sRGB (because the monitor is actually further from sRGB than older ones).
This is, of course, only an explanation for what's happening and still offers no new ideas on how to set things up to minimize the difference between non-color-managed Firefox/IE and Photoshop/Safari.
And the Gretag/Eye-One/Huey people are still staying mum, at least to me.
Maybe it doesn't make a good ad to say, "Now YOU can see the Internet in wonky color as never before! Go from suntan to sunburned without even stepping on a beach!"
"But don't worry, you can always go look over the shoulder of the accountant, who has just a regular Dell. The Internet looks natural on it because you didn't think to use the Eye-One for the QuickBooks machine."
Well, I have news for owners of the Dell 30" monitors like we use at SmugMug. While info on the i1 is sparse on X-Rite's site (they bought Gretag), one point they did make is how important it is to use the monitor's manual controls to get a good calibration, and they have a mode to assist in that. But our monitors only have brightness controls.
However, I read a review of the just-released Dell 30" and they said they learned from the previous version...the one we own. Now they've gone control-crazy with individual controls for almost everything and, importantly, an sRGB preset.
They said it took them time but with those controls they were able to dial in accurate color.
I saw other posts for the model we have and this was a typical response, from this page:
Paul, I would like to ask you to do an experiment for me if you would.
1) Would you open a browser window of some website with some good photography.
2) Take a screen shot and, while viewing the image at the website, view the screen shot in Photoshop side by side, but make sure to have the color management settings on "North American Web / Internet" for Photoshop. This will ensure the PS image is displayed correctly in sRGB.
From what I have read about other high gamut monitors, none of them display colors correctly outside of color-aware programs like Photoshop. I want to see if this is happening with the Dell as well.
What happens with the other monitors is everything appears oversaturated. The reason is that the monitor's native gamut is wide, but although sRGB is the default for Windows and the web, all images are displayed in that same wide gamut unless you're using a color-aware program. This maps the colors incorrectly, so they appear oversaturated. Right now, with monitors only being 92%, the effect is somewhat subtle, but as gamuts rise to 100-120% next year, I suspect at some point people are going to stop thinking they're seeing new colors and realize something is wrong.
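The arithmetic behind that quote is easy to check. Here's a sketch with the standard published D65 matrices (rounded) and a simple 2.2 gamma for both spaces; both of those, and the swatch value, are rough illustrative assumptions:

```python
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2973, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])

code = np.array([180, 120, 100]) / 255.0   # an sRGB skin-tone-ish swatch
linear = code ** 2.2                        # rough gamma decode

intended = SRGB_TO_XYZ @ linear            # the color the author meant
displayed = ADOBE_TO_XYZ @ linear          # same codes on wide-gamut primaries
print(intended.round(3), displayed.round(3))
# Same 8-bit codes, more saturated primaries: the displayed color gets
# pulled toward the panel's gamut -- the oversaturation being described.
```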
I'm feeling like we're getting down to it now. You?
However, I read a review of the just-released Dell 30" and they said they learned from the previous version...the one we own. Now they've gone control-crazy with individual controls for almost everything and, importantly, an sRGB preset.
Before you get too excited, please keep in mind what you're getting.
No CCFL LCDs have physical control over RGB; it's a LUT. The LUT can be 8-bit at the graphics card (not useful) or, with newer, better, more expensive units, internal in higher bit (more useful). The only LCDs that allow true control over RGB are the two I know of that use an LED backlight with three colored LEDs (Samsung and NEC).
Lots and lots of CCFL LCDs have all kinds of controls, many called RGB. You need to look at where and how the adjustments are being conducted. Messing around with 8-bit LUTs might give you the impression you've altered the color, but the result is almost always a lot of banding on the display (a step forward, a step back).
When you view an image, it's nice to know whether banding is in the actual document or the result of the video system. The latter is neither useful nor desired.
What happens with the other monitors is everything appears oversaturated. The reason is that the monitor's native gamut is wide, but although sRGB is the default for Windows and the web, all images are displayed in that same wide gamut unless you're using a color-aware program.
You've got to get past this idea that sRGB is somehow a default for stuff and right. It's not. Your display isn't producing sRGB. There's no default profile for the monitor that expects this (the so-called default profile for the monitor should reflect the conditions of the monitor and be used in applications and the OS). I thought we made that pretty clear.
On a Mac, with my wide gamut display, using the myriad of ICC-aware applications and the OS, everything looks just fine (not under- or over-saturated). And we know why. Yes, a wide gamut display is farther from the elusive 'goal' of sRGB, so in a dumb web browser the saturation looks different than on a lower gamut unit.
The bottom line is, if you don't have an ICC aware application, things are not going to match. If you're using a wide gamut display outside an ICC aware application, you have to ask yourself why you are doing this or why you spent the extra money for a wide gamut display.
Very interesting and useful description.
You've got to get past this idea that sRGB is somehow a default for stuff and right. It's not.
I can't, Andrew. It is the default for the web and right for 99.99% of the Internet's pages. Our customers don't want to surf the Internet and have color mismatches and oversaturated colors.
They want to see QuickTime movie trailers on Apple's site in the colors Spielberg intended.
Andrew, we get your point. If you want accurate color, profile your display and use color-managed apps. You do not need to repeat that any more - we get it. That is not the issue we are working on.
Given that there are hundreds of thousands of SmugMug viewers (many are not customers, and many are people SmugMug has never spoken to and never will), we are working on the best tactic to understand, mitigate, or explain their experience if possible. We know it won't be accurate color, but we're working on understanding what things, if any, could be done to give them the best possible experience. That is the majority of the viewing public today and for a while, so it seems like a worthwhile endeavor for a business that is trying to serve them. If you think that is pointless, then that's fine, and you can bow out of the conversation and let us continue until we reach our own conclusion.
Maybe you're right (that there's no worthwhile mitigation of any kind that we can do), but we aren't done understanding our options yet, and we're on our own journey to understand how all this stuff works. At the very least, we're gaining an understanding of why things are the way they are when they aren't color managed, so we can explain it to folks who ask (and perhaps motivate them to get color managed).
So far, I see pretty good progress on our end:
SmugMug is now serving ICC profiles upon demand to ICC-aware browsers.
We have a much better understanding for why these newer LCD monitors look more saturated when not color managed.
We understand the role of color temperature in the calibration/profiling process.
We understand the difference between calibration and profiling and now have a better understanding of what to expect when non color-managed.
We are starting to understand what Firefox 3 is implementing and how profiles might be served even more efficiently to that browser.
We're getting enough of an understanding that maybe we can even make some suggestions to the right people for future versions of a couple browsers.
I can't, Andrew. It is the default for the web and right for 99.99% of the Internet's pages. Our customers don't want to surf the Internet and have color mismatches and oversaturated colors.
It's not! If you understand that people's displays, even when calibrated as you and Andy and others have tried, do NOT produce the same color appearance when viewing sRGB images as seen in Photoshop, how can you say everyone is seeing sRGB?
IF what you believe is true, then we'd all be seeing the same color appearance by simply uploading sRGB. Clearly that's not happening.
Upload an sRGB image or open an sRGB image on two different machines using two different displays in a non ICC aware application. If possible, do this on a Mac and a Windows box. Do they look identical? No. So how can sRGB be some kind of standard, correct assumption for a document that you KNOW is in sRGB?
You have to stop drinking this MS koolaid about sRGB. Yes, it is the closest color space for the zillions of CRT displays out there (lack of calibration notwithstanding), but they are not producing sRGB. If they did, we wouldn't be having this conversation.
Think of sRGB like the Government specified ratings for miles per gallon. YMMV (and it will).
I keep hoping you're getting it, only to be sidetracked by this belief system about sRGB. Again, the only device that can really produce sRGB is a circa 1993 CRT with P22 phosphors. And even if we had 5 such CRTs, they change over time (the reason we frequently have to calibrate all displays). Or someone mucks around with the OSD controls of the display and, bingo, the old sRGB behavior is no more.
Andrew, we get your point. If you want accurate color, profile your display and use color-managed apps. You do not need to repeat that any more - we get it. That is not the issue we are working on.
I still wonder if everyone does 'get it' when they keep repeating this nonsense about sRGB. See above post.
The only way we can possibly get close to this sRGB behavior (display technology notwithstanding) is using calibration, even without access to a profile.
I simply can't imagine why folks think that out of the box, all the display systems are producing sRGB, based on what you've all done to test this so far. Then we have displays that are 3 years old, ones running 150 cd/m2 brighter than others, one costing $1800 and another costing $300, or with all the mucking around people do to their displays to make them look "right." They are all producing sRGB? Not on your life.
The sRGB color space doesn't make a difference here. It's the closest color space we have to most displays (well, certainly the slew of old CRTs). But we all know the same color numbers in sRGB uploaded to the web absolutely do not produce the same color appearance for all users. So how is sRGB useful here? It's the best target we have, but we're aiming at a very broad side of a barn. Hitting the barn and hitting the bulls-eye are two very different things.
Of course, (for the last time) we know that we can produce the same color appearance from the same numbers from any color space, not just sRGB and we know by now what's required of users to get this. Otherwise, all bets are off.
Think of sRGB like the Government specified ratings for miles per gallon. YMMV (and it will).
A very good analogy. Those are actually just benchmarks, and are relevant only when compared to other vehicles' MPG. Nothing more. Heck, this year they did a wholesale change in how they are calculated and went back and restated older MPGs.
EPA tests are designed to reflect "typical" driving conditions and driver behavior, but several factors can affect MPG significantly:
How & Where You Drive
Vehicle Condition & Maintenance
Fuel Variations
Vehicle Variations
Engine Break-In
Therefore, the EPA ratings are a useful tool for comparing the fuel economies of different vehicles but may not accurately predict the average MPG you will get. http://www.fueleconomy.gov/feg/why_differ.shtml
Hmmmm. Vehicle (monitor) variations, Engine (monitor) break-in,...
Think about it. How many people have you met who are shocked, if not downright PO'ed, that their mileage does vary from that window sticker?
You have to stop drinking this MS koolaid about sRGB. Yes, it is the closest color space for the zillion's of CRT displays out there (lack of calibration not withstanding) but they are not producing sRGB. If they did, we wouldn't be having this conversation.
Apparently we have HP to blame as much as MS. Such interesting reading on how they specifically dis embedding ICC profiles:
Currently, the ICC has one means of tracking and ensuring that a color is correctly mapped from the input to the output color space. This is done by attaching a profile for the input color space to the image in question. This is appropriate for high end users. However, there are a broad range of users that do not require this level of flexibility and control. Additionally, most existing file formats do not, and may never support color profile embedding, and finally, there are a broad range of uses that actually discourage people from appending any extra data to their files. A common standard RGB color space addresses these issues and is useful and necessary. We expect application developers and users that do not want the overhead of embedding profiles with documents or images to convert them to a common color space and store them in that format. Currently there is a plethora of RGB monitor color spaces attempting to fill this void with little guidance or attempts at standards. There is a need to merge the many standard and non-standard RGB monitor spaces into a single standard RGB color space. Such a standard could dramatically improve the color fidelity in the desktop environment.
{emphasis added}
It looks like the HP/MS chickens have come home to roost.
Besides changes in display technologies, today you have more people moving from our previously mentioned Group #3 to Groups #2 & #1. Heck, there are simply more users. And in 1993-1996 most people were using Lotus 1-2-3 and Word Perfect on their Windows 3.11 PCs, maybe Win 95, on Netware 3.11 networks at work. The only thing left from then? sRGB! (OK, I do know folks who still have WordPerfect and Netware - 4 of them)
Another observation: Besides applications, monitor mfg, calibration/profiling tools, what about the video card manufacturers?
"Don't ask me what I think of you, I might not give the answer that you want me to. Oh well."
-Fleetwood Mac
I have been reading and digesting all the great research and ideas.
On the original problem with the Dells I would point you to an article from Eizo about monitors with extended gamut. We use Eizo for our soft proofing stations.
I think the main problem here is a mismatch between your calibrator software and the total gamut of the monitor. Profiles created with a calibrator don't change a color; they only translate it to the output. If the curve mapping in the calibrator software does not match the total gamut of the monitor, then the translation will not be accurate. I have noted some new calibrators that are designed to be used with HDTVs, and I wonder how one of those would work on your Dell models.
I really enjoyed the comparison tests with the three-up images on screen. We do this all the time with profiles and call it a round trip of the profile. We do it with printer profiles rather than displays, but I believe the theory is the same.
I will include a link to Gamutvision, which can be used for this type of testing and will give you a lot of the same numbers.
Discussion on white point and lack of direction from anyone on what is the best setting.
I think the biggest point here is what your intent and setup for your monitor are. If no color management is going on and, out of the box, you want to know where the white point should be, I would say 6000-6500, because the average viewer thinks this looks good. White is white, and at normal screen brightness levels most people would say 5000 is dark and warm, or yellow. Now if you are using this monitor as a calibrated piece of equipment with print matching and controlled viewing boxes, that is a different story. We use high quality viewing stations that are set to 5000, and as such we must have our white point set to match this in order to soft proof properly. If we did not do this, then everything would look too warm comparing the prints in the viewing station and the screens.
We also match the brightness of the viewing stations to the brightness level of the screen so our density judgments are not thrown off.
I still am not sure if everyone who reads this understands sRGB and color spaces. It is not the easiest concept to get your arms around. For that matter, profiles themselves are pure math in action, and we all know how exciting that is to talk about at parties. I'll save this for another post if anyone is interested.
One thing that is most important with profiles is white point, as it is from end to end in digital photography, and in old analog photography as well.
Something everyone might want to ponder, from ages gone by: when a new film came out, say Velvia (as I think this was mentioned), it had a lot more color saturation than a lot of other films, but did we say after looking at Velvia that the other films were out of calibration? No, this was the look of Velvia, and either you liked it or you didn't, and maybe you used it for some projects and not for others. This also applies to digital and color spaces and calibrations. Remember that in the end all output devices have a set gamut and can only produce within that range. We have to interpret in the image what the real life color was and then translate that to something the output can reproduce: perceptually pleasing and what our memories tell us it looked like. This is the function of profiles, to squeeze all of that data into whatever space we need to get the job done.
Well, I've probably created more questions than answers.
Carry on netCitizens
David Egolf CafePress Svcs.
1890 Beaver Ridge Circle
Norcross, GA 30071
W : 678-405-5500 ext 5620
C : 678-790-4553
Baldy (Registered User, Super Moderator; Posts: 2,853)
Given that there are hundreds of thousands of Smugmug viewers (many are not customers and many are people that Smugmug has never spoken to and will never speak to), we are working on what is the best tactic to understand, mitigate or explain their experience if possible. We know it won't be accurate color, but we're working on understanding what things, if any, could be done to give them the best possible experience.
5.4 million unique viewers/month according to Google analytics. 3.6 million uniques/month according to Quantcast, who can't measure them all.
John, you're a master at keeping the most important goals in focus.
I've gotten a few emails from the Apple guys saying my old boss would like to see an email from me with a really crisp suggestion for what to do about this mess we've created. As this thread points out so clearly, all of us have points of confusion despite how much time we've devoted to it. Imagine the plight of the consumer.
Here's a trial balloon for what we should tell Steve:
1. Let's ship the Mac with gamma set to 2.2. It's the Internet and TV standard. No Apple person is defending 1.8 anymore and the web is full of people suggesting 2.2, including Apple's own pages and the monitor calibration vendors.
I'll show him a Pixar clip on an iPhone, which seems to have a 2.2 gamma (can someone confirm?) and looks good, compared to Macs where his films look washed. That oughta cause his head to explode all over the room.
Spielberg and Lucas spend gobs finessing over getting their films just right for TV. Yet the owner of his own film studio is the guy most responsible for making their films look washed.
(I'm intimately familiar with the decision to go 1.8 gamma, 'cus I worked for Steve in the days of dim monitors and color gurus who envisioned the web as a pre-press medium instead of the consumer medium it has become.)
2. Create a tag for color space info. Glomming a bunch of fatty ICC profiles on a page would be like Andrew handing out a chapter of his book instead of his business card to people he meets. Anyone who writes code and Steve have gotta say, "no brainer."
3. Default to sRGB for untagged elements. HTML and CSS only know sRGB, so rendering them in something else is expressly rendering them in something the designer did not intend. Same for all Flash designed to this point, 99.999% of all jpegs, and pretty much all gifs and pngs.
If people want aRGB or ProPhoto, they can tag their images. That .1% of the population would know how.
One. Two. Three. Simple. Respects the consumer network the Internet is and the standard that artists create their work for, and yet lets people with wider gamut lust have wider gamut.
No?
This sounds like a good summary of the asks.
If I understand what you're proposing and how tagged images interact with gamma settings, wouldn't item 2 allow you to fix everything on Smugmug for people running the fixed version of Safari? And isn't item #2 free of any tradeoffs at Apple or for customers (meaning there's no downside to it)? It has no backward compatibility issues. Only sites that choose to implement the new tags will get the new behavior. Since FF3 has already implemented some of these new tags, once you put the new tags in Smugmug, you'd be set for both new browsers.
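On the publishing side, the tagging half of item 2 is already doable today by embedding the profile itself. Here's a minimal sketch (Python with Pillow, which wraps littlecms; the filenames are made up) of tagging a JPEG as sRGB so a color-managed browser has something to honor:

```python
from PIL import Image, ImageCms

# Build a standard sRGB profile and serialize it to bytes.
srgb_bytes = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB")).tobytes()

# Embed it on save; a color-managed browser can then convert the image
# through the viewer's own monitor profile.
img = Image.open("untagged_photo.jpg")  # made-up filename
img.save("tagged_photo.jpg", icc_profile=srgb_bytes)
```

The sRGB profile is only a few KB, though Baldy's point stands: a compact color-space tag would be even cheaper than shipping profile bytes with every image.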
I agree with items 1 and 3 as the "right" thing to do, and if they'll do them that's great, but I'm sure there is a little heartburn or resistance about them. It may be so accepted that item 1 is wrong that they'll finally just fix the gamma issue, but I can certainly see how some existing things may have adapted to the 1.8 setting and may seem broken on new Macs once they switch.
Item 3 IS the right thing for the internet as a whole. While an untagged image is mystery meat (to borrow Andrew's phrase), an image on the open internet has ZERO chance of being in your own monitor profile. Your monitor profile is specific to your monitor or specific to the model and brand of monitor if you haven't profiled your display and have installed some default display profile. The image being displayed was put on the internet by someone who knows nothing about your system so it can't possibly be in your monitor profile. So assuming the monitor profile is probably the worst choice there is.
Assuming sRGB at least has a large chance at being right. The only time assuming sRGB would be worse than it is today is if the monitor profile is somehow corrupted and contains bad data. Then, using it to translate colors to your monitor would be worse than not using it at all (which is what Safari does today for an untagged image).
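To make the proposed default concrete, here's the decision in hedged pseudo-Python (the names are illustrative, not any browser's actual API):

```python
SRGB = "built-in sRGB profile"  # stand-in for the browser's bundled profile

def source_profile(embedded_icc_profile):
    # Item #3: honor an embedded tag if present; otherwise assume sRGB,
    # never "whatever the raw numbers happen to mean on the local monitor".
    return embedded_icc_profile if embedded_icc_profile else SRGB

# Display path for every image: convert source_profile -> monitor profile,
# instead of passing untagged numbers straight to the screen.
```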
I can imagine that there is a bit of heartburn about changing item #3 because it won't match Flash until Flash also starts assuming untagged images are sRGB and implements color management.
So, the point of this posting was mostly to say that you might want to change the order of your asks. I think you can solve your problem for new versions of Safari if they implement #2 and it really has no downside at all for Apple. The other two items are also good things to do, but aren't required for you to solve the Smugmug issues in Safari and may be harder things for Apple to decide to deliver because they involve some short term tradeoffs.
I'm making an assumption that an image in Safari with an ICC profile displays properly whether the Mac gamma is set to 1.8 or 2.2. The monitor compensation that happens in color-managed Safari will accurately compensate for the different gamma settings. If I've got that wrong, then you would need item #1 too in order to get things working by default on new Macs.
Baldy (Registered User, Super Moderator; Posts: 2,853)
edited January 29, 2008
Good response, as always. It would be interesting to think through what the downside of switching to 2.2 gamma is.
Video is becoming very important to us and I know it is to Apple. If you save your production in various ways like with iMovie, Final Cut, or After Effects and play it various ways like with Flash or QuickTime, you find out in a hurry that it's a mess too, mostly due to non-standard gamma.
I get the feeling that the people in this thread mainly deal with stills. The Hollywood boys are always howling about the gamma and posting stuff like this in the forums:
for the 6 millionth time
your mac displays are 1.8 gamma- PCs and TVs are 2.2
To work in Video it is best to change the gamma on your computer display in System Preferences > Displays to correctly display TV gamma
gary adcock
Studio37
They don't want to hear about your mileage varying or profile fairy dust, they want fidelity with SMPTE 170M and its equivalents for HD.
Comments
It can but there's no guarantee.
Now things get interesting when I view my Flash galleries on a wide gamut display (93% of Adobe RGB (1998)). FireFox and Safari match. This probably indicates that the upload from LR doesn't contain a profile. They don't match LR there's a difference in saturation but its not a lot (which I wouldn't have expected). In fact, at least with some of the images I compared, I don't personally mind what I'm seeing on the web pages. A bit more Velvia look to them but nothing I mind. Hue seems to be well maintained, its saturation that is altered. I'm viewing the shot of the man washing his green boat. Adobe RGB (1998) is larger in green primary gamut than sRGB (that's its biggest difference). Looks fine to me. This display is calibrated far from what we'd be calling "sRGB" both in terms of its gamut, luminance etc.
The 23" Cinema display, calibrated and profiled to native gamma and white point looks identical in the three applications. This may be a good point to calibrate such displays to (native/native) where no LUT is being affected.
Unfortunately, not all calibration packages provide this.
Author "Color Management for Photographers"
http://www.digitaldog.net/
Match what? Each other or Lightroom?
They *should* match each other regardless of monitor because neither one is being color managed.
http://tinyurl.com/ysbfuu
We have an (original) LAB image, sRGB and Adobe RGB (1998) image with lots of colors.
I see a difference between Firefox and Safari (again, its saturation). Safari matches LR previews on a standard sRGB calibrated and profile display and on a wide gamut display. FireFox doesn't on either.
Note on viewing. Ideally when comparing images, you should do so at 100% (1:1), especially when dealing with Photoshop. Not really possible here due to the size of the web images. But at the very least, attempt to size the comparisons to the same size on screen. Or download the full rez images and resize to match the web.
The full rez image in Lab can be found at:
http://homepage.mac.com/billatkinson/FileSharing2.html
It's 51MB!
The two RGB images are in a folder on my public iDisk called RGB Test Images and are about 12MB:
My public iDisk: thedigitaldog
Name (lower case): public
Password (lower case): public (note the first letter is NOT capitalized)
To go there via a web browser, use this URL:
http://idisk.mac.com/thedigitaldog-Public
Author "Color Management for Photographers"
http://www.digitaldog.net/
Browsers match each other with Flash, but don't with HTML.
Author "Color Management for Photographers"
http://www.digitaldog.net/
I did a bunch of measurements today to see how my monitor behaves in Firefox, Safari and Photoshop with a bunch of different monitor profiles. These are the things I was hoping to gather some info on:
- How do Safari, Firefox and Photoshop compare with different calibration profiles?
- How does the color temperature setting in the calibration process affect the colors that are displayed in each app
- Can I learn anything that might help mitigate the difference between Firefox and Photoshop?
- Which color temperature setting is the best for my monitor if my objective is to see accurate color (not a representation of printed output)?
Summary of procedure
- Get a color chart (one with somewhat standard blocks of color on it)
- Make sure it's in sRGB and tagged that way
- Resize the image so three copies of it fit at 100% on my screen
- Do a particular color calibration of my screen (details below for which ones I did)
- Load the image into each app
- Position all three apps so I can see all three at the same time
- Use my camera as a poor mans color measuring tool by taking a picture of the screen
- Load that picture into Photoshop and use the eye dropper to read the color values off 6 different color swatches in the image (a programmatic version of this step is sketched after this list)
- Record all those values in both sRGB and LAB (using the info palette in Photoshop)
- Calibrate the screen for the next color temperature
- Reload all the apps
- Take picture, repeat process for each color calibration
- Put all results into a spreadsheet
- Decide what conclusions I can draw from the data
- Write up results
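As mentioned at the eyedropper step, the swatch reading could also be done programmatically with Pillow; here's a hedged sketch where the filename and crop box are made-up placeholders for wherever a swatch sits in the screen photo:

```python
from PIL import Image, ImageStat

shot = Image.open("screen_photo.jpg").convert("RGB")  # made-up filename

# Average a small box well inside one swatch to dodge edge pixels and the
# deliberate slight defocus.
box = (1200, 800, 1240, 840)  # invented coordinates of a swatch interior
mean_rgb = ImageStat.Stat(shot.crop(box)).mean
print([round(c) for c in mean_rgb])
```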
Conclusions
Here are some of the conclusions I can confirm:
- The detailed color measurements are in this table if you want to see them yourself. I personally find it easier to look at the LAB numbers.
- The color temperature setting does indeed affect how a calibrated, profiled screen looks in Photoshop so this color temperature number is very important if you want to see accurate colors
- Safari and Photoshop display very similar colors when viewing this ICC profile tagged image (easily within the margin of error of my measurements)
- Firefox varies significantly from Safari and Photoshop (no surprise here)
- Firefox displays quite differently after calibrating/profiling than before and how it displays depends a lot on the color temperature number you pick for the calibration process. Since Firefox is not using my monitor profile for monitor compensation (like color-managed apps use), that means the Eye One Display 2 must be changing the default monitor display (probably with LUT values) according to the color temperature setting.
- I saw some small luminosity fluctuations between swatches that were otherwise nearly the same color. I don't know exactly why that is, but it could be explained either by slight fluctuations in the backlighting on my monitor or by some light fall-off with my lens.
- With this set up, I am unable to assess absolute color accuracy, but I can measure differences between the three different apps. This is because of unknown white balance settings and uncalibrated RAW converter color interpretation. But, the measurements are consistent so they can be used to look at color differences between scenarios, just not absolute color accuracy.
- I see the smallest difference between the Firefox colors and the Photoshop colors in either the "native" calibration or the 6000 degree color temperature calibration, though there are still significant differences. The whites/grays are pretty close in Firefox at this temperature, but most of the other colors are significantly different.
- There's no good setting that can make skin colors in Firefox get close to Photoshop. I picked the pink and peach color swatches because they were in the same neighborhood of skin color, but they are always different - consistently showing similar luminosity, but higher numbers in both A and B channels.
- Looking at the 6000 degree calibration as the one I will probably use, we see that in all swatches measured, Firefox shows A and B values further from zero (more positive and more negative). It's as if the monitor has "cranked up the color" and the calibrated view through Photoshop has brought it back to reality. This might explain why some people like the Firefox view better than the Photoshop view. It appears to have richer colors.
- "Native" for my monitor is almost identical to the 6000 degree calibration. I assume this is monitor dependent.
Details of the experiment
Monitor: HP LP3065 (30" LCD)
Windows OS, Vista 32-bit Home Premium
Gretagmacbeth EyeOne Display 2 USB calibrator puck and software
NVIDIA GeForce 8600 GTS video card
In more detail, here's what I did. I started with an image that has a color chart in it. Andrew has one linked from his site on this page, so I started with that one. It was in the ColorMatch color space, so I converted it to sRGB and then downsized it so three simultaneous copies of it would fit on my screen at 100%. Here's what a small version of the image looks like:
While there are lots of colors on this that I can look at to subjectively evaluate, I decided to focus only on the solid colors in the color patch below the photo of the lady as those are easiest to consistently measure with the eye dropper and info palette in Photoshop.
Then, I set a particular color calibration on my screen. For the "uncalibrated view", I used the HP monitor profile that comes with my monitor, which is presumably some factory-supplied standard monitor profile. For the other color temperatures, I did a full-screen calibration using the Eye One Display 2 system that I have. I tried to place the puck in the same part of the screen for each calibration. I made the room as dark as possible and even shielded the screen from any stray light through the blinds. I tested all of the following color temperature settings in my color calibration software:
- Native
- 5000 (fairly warm)
- 5500
- 6000
- 6500 (cooler)
Then, after setting a particular color calibration, I loaded the test image into each of the three apps. I positioned the apps on the screen so that I could see all three at once. With my Nikon D2xs and 105mm macro lens on a tripod, set for manual exposure (so the exposure would be exactly the same for all photos) and for fixed sunlight white balance (so we'd have a consistent WB setting for all photos), I took a picture of the screen in RAW.
I noticed that if I focused on the screen to make the image sharp, I would get a lot of moire. This is likely because the pitch of the pixels on the screen is somewhat close to the pixel pitch on my sensor at the distance I was at. I figured the moire would be a really bad thing for consistent color measurement so I decided to just defocus the image slightly. This makes the moire go away (because it blurs the screen pixels to a larger size so they no longer match the sensor pixel pitch) and, as long as the blur isn't too bad, it shouldn't really affect a color swatch measurement in the center of the swatch. I thought about doing the blur in post processing, but decided it was better to just never have moire in the image in the first place.
Here's what one of those pictures looks like (purposely slightly out of focus). Firefox is on the left, Safari in the middle and Photoshop on the right:
I repeated this process for each of the different color calibrations, so that I ended up with 7 test images, each with a different monitor profile. The resulting test images after converting to JPEG are all here.
I then loaded the RAW file into ACR, turned off all auto adjustments so all images got the exact same settings in ACR and opened them one at a time in Photoshop. In Photoshop, I took color readings using the eye dropper from the image in Photoshop off six different color swatches for each app and I recorded the numbers in both RGB and in LAB (just using the info palette) in a big table. That's 18 color measurements, recorded in both RGB and LAB, for each image. I realized that the LAB values would be useful for seeing differences in luminosity separately from color.
Data
Here are all the color measurements: http://jfriend.smugmug.com/gallery/4224024.
Here are the screen photos after default conversion to JPEG with no auto settings: http://jfriend.smugmug.com/gallery/4224315.
Discussion
By looking at the numbers in the table, I can see that the 6000 degree numbers are very close to the native numbers. I think that means that my monitor is natively around 6000 degrees.
In this test, it is possible to get the neutrals in Firefox and Photoshop to line up pretty well (~6000 degrees), but even when they line up, the other colors are off. I think this proves that the monitor does indeed need calibration, and the calibration it needs is not as simple as a uniform color shift.
There are so many numbers in this table that I was trying to figure out if there was some graphical way to represent them, to perhaps see some sort of trend that's hard to see in the raw numbers, but I didn't come up with any immediate ideas for what to show. I also made the mistake of entering each color triplet into a single cell in the spreadsheet, so it would take some busy work to change the numbers so they could be graphed. I did wonder about trying to quantify the color differences between Firefox and Photoshop with some sort of formula based on the differences between the color patch measurements, but I haven't tried it yet (a delta-E sketch along those lines appears further down).
In every single case I looked at, the Firefox numbers were "more saturated" than the Photoshop numbers. I'm guessing that's because more saturated monitors sell better, so manufacturers crank it up in the design of the monitor. The monitor compensation that Photoshop does actually serves to "dull" the colors a bit. I believe it's probably doing the accurate thing, but I can now see why some people say they prefer the Firefox version of the image over the Photoshop version. It appears to have richer colors. How much this happens is presumably monitor-dependent, but I wouldn't be surprised if it's a common industry norm.
The LAB numbers are a lot easier for me to draw conclusions from because you can easily see luminosity separate from the A and B color channels. If you aren't familiar with the LAB color space, then it's probably worth a quick read of this Wikipedia article and there are many other articles on the web if you search on Google.
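Since Lab keeps coming up, here's a self-contained sketch (plain Python, standard sRGB/D65 constants) that converts an 8-bit sRGB reading to Lab and computes the CIE76 delta-E between two readings, which is exactly the kind of single-number difference formula wondered about above. The two sample triplets are invented:

```python
import math

def srgb_to_lab(r, g, b):
    """8-bit sRGB -> CIE Lab (D65/2-degree), standard constants."""
    def lin(c):  # inverse sRGB companding
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # linear sRGB -> XYZ (D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(lab1, lab2):
    # One number for "how different"; ~2.3 is often cited as just noticeable.
    return math.dist(lab1, lab2)

# Invented example: the same peach swatch as read from Firefox and Photoshop.
firefox = srgb_to_lab(246, 174, 150)
photoshop = srgb_to_lab(238, 180, 160)
print(delta_e76(firefox, photoshop))
# chroma = math.hypot(a, b) isolates the "saturation" axis discussed above.
```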
Comments? Questions? Any other conclusions you can draw from the numbers?
Homepage • Popular
JFriend's javascript customizations • Secrets for getting fast answers on Dgrin
Always include a link to your site when posting a question
6000K may be ideal for your display. I don't know if that's true for every display. The luminance will likely alter this sweet spot too. And I'd refrain from using Kelvin for calibration and instead focus on the standard illuminants, as then at least you're talking about a single color.
Unless I missed something, I'm not sure how you got the Lab values (did you measure with the colorimeter?).
The native white point of various displays might be all over the map.
Author "Color Management for Photographers"
http://www.digitaldog.net/
Very impressive.
I am going to spend some quality time with this over the weekend (along with the other links you referenced), but for the moment I've been swept into the world of video formats.
John, you realize we have a beeg problem to face in the next few days, and it's your fault (well, you and Andrew)? We're converting our built-in slideshow from Javascript-based to Flash by popular demand. So what's gonna happen is you'll be browsing your galleries in Safari and because we now attach profiles to all images (unlike, for example, the International Color Consortium, Apple and Adobe) you'll see your photos in all their glory (unlike, for example, with Andrew's Flash-based galleries).
And then you'll click the slideshow button and presto, the color will shift because now your photos are displayed in Flash, which doesn't know about color management.
And our customers are not gonna blame Adobe, Apple, John and Andrew... The shrill emails coming to our support heroes are gonna say, "SmugMug wrecks my photos!" :cry :cry :cry
His new user title should be "6000 Degrees of Calibration"
Dgrin FAQ | Me | Workshops
I was hoping to uncover more useful conclusions in the testing, but at least I've verified a bunch of things that were talked about, and I think I understand what's going on much better now. The two big misunderstandings that I've run through in the last 6 months are:
1) Calibrating your screen doesn't do much calibrating at all. It mostly profiles and it's up to a color-managed app to use the profile to make accurate colors. So, calibrating doesn't really help Firefox/IE at all. Ask 20 people who bought a color calibrator in the last year and I bet 19.99 of the 20 think that their screen will display more accurate color even in Firefox and IE after calibrating it.
2) These calibration/profiling tools have a design center around matching a printed page and that's why they make you pick a color temperature. They do not have a design center around giving you accurate emissive color so your screen will look the same as mine when we both use color-managed tools. If we happen to coordinate and pick the same color temperature, they should come close, but if we don't or don't even know each other, they won't match. I hope this industry offers tools for "web viewing" that are centered around color accuracy for web viewing rather than color matching to a print in your specific lighting. That's probably the larger market going forward as the market for color calibration goes mainstream.
On all the color issues, the big lesson to me here is that for the majority of the public (which is not very educated about color stuff), color differences between the same image displayed two different ways are way, way more noticeable than color accuracy issues. Thus a difference between Photoshop and your browser is far more visible than both being the same but wrong. When you're striving for color accuracy and the tools your viewers are using to view the web are only halfway there, what a pain! I feel for the predicament. I guess you'll just have to take a few scars for heading this way now, and the tools will catch up over time. The only other choice is to wait until the entire installed base of tools gets upgraded, and that would be a long way out, so you've probably got to get in before everything is there anyway. I feel a big color FAQ coming.
Any idea what the ETA is for color-management in Flash? I've heard that it's coming sometime and I can certainly see how it's in Adobe's interest to do so, but I've not heard when.
Homepage • Popular
JFriend's javascript customizations • Secrets for getting fast answers on Dgrin
Always include a link to your site when posting a question
I'm not sure I understand your twin statements about FF displaying quite differently after calibrating/profiling and it not using the monitor profile. Can you explain? I'm assuming you mean FF doesn't but the OS does. Else why would the net result be different?
Are we discounting Gary Ballard's views? I ask because the Aperture team and Apple's pages reference him on these issues, and his views and mine are very close (but if presented with compelling data, I'll change).
His pages are long and multi-topic so I'll extract the key points to make this easier:
"Obviously there is nothing we can do about all the computers in the real world that use bad monitor profiles on uncalibrated systems, but good 2.2 gamma, 6500/D65 monitors will display sRGB with the most faithful appearance with the best, truest color possible."
"If untagged sRGB displays incorrect color in Safari or FireFox, I would say you have a bad monitor profile or bad operating system, software bug or odd approach to monitor profiling."
"If Photoshop working color is set to sRGB and the monitor profile was created for 2.2 gamma and 6500K, color shifts in Save For Web result from a bad monitor profile."
The catalyst for this thread was that Andy and Steve calibrated/profiled their monitors and after doing so untagged sRGB images looked worse (oversaturated). It's hard for me to understand how that sacrifice has to be made to make Photoshop accurate. Photoshop has nearly unlimited latitude to change the image.
If Eye-One's shoddy code or cheap sensor is creating a display profile that doesn't cause the viewer to see accurate color, what's to stop us from using the spectrophotometer I own, displaying a digital Gretag color chart of known RGB values, and writing our own profiles? Then we can check the thing we haven't checked yet: are we creating accurate calibrations/profiles?
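Writing our own profiles is a bigger project, but displaying a chart of known RGB values is easy. Here's a sketch with Pillow, using a few commonly cited approximate sRGB values for ColorChecker-style patches (treat the exact numbers as illustrative, not vendor data):

```python
from PIL import Image, ImageDraw

# Approximate sRGB values often cited for a few ColorChecker patches;
# illustrative only, a real test should use vendor-published data.
patches = [
    ("dark skin", (115, 82, 68)),
    ("light skin", (194, 150, 130)),
    ("blue sky", (98, 122, 157)),
    ("foliage", (87, 108, 67)),
    ("red", (175, 54, 60)),
    ("neutral 8", (200, 200, 200)),
]

size = 150
chart = Image.new("RGB", (size * len(patches), size))
draw = ImageDraw.Draw(chart)
for i, (_, rgb) in enumerate(patches):
    draw.rectangle([i * size, 0, (i + 1) * size - 1, size - 1], fill=rgb)
chart.save("known_value_chart.png")  # display full screen, then measure it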
Here we go again.
An ICC-aware app uses the profile to preview; a non-ICC app doesn't. What you should be assuming is that FF doesn't use the profile and Safari does, not the OS.
The assumption is that if you calibrate a display, that's a fixed preview for all applications, but of course it's not. Calibration affects all applications, but ICC profile compensation only affects some. So it's perfectly normal that the two don't match.
Author "Color Management for Photographers"
http://www.digitaldog.net/
Let me try to explain. When you calibrate your display, there are potentially two things going on (and I see both of these happening in the measurements of my system).
1) The default display colors are changed.
2) The resulting screen is then measured, and a monitor profile which describes its behavior is created and put in a well-known system location.
There are a couple of ways that the first item is accomplished, depending upon your technology. The way that I understand it, a "look-up table", often called a LUT, is created and given to the video driver. I don't understand the mechanics of this, but it happens at the OS/driver level. This will change the colors the monitor produces when fed a given RGB value. That means that non-color-managed apps like Firefox are directly affected by this. Unfortunately, these LUT changes aren't really fine calibration settings. They don't successfully "calibrate" your monitor for non-color-managed apps. I don't really know this for sure, but I think all they are trying to do is to get your monitor in the right ballpark so that the corrections specified in the monitor profile are more manageable and not too far off.
Then, once these LUT values are put into place, the resulting screen is measured for accuracy and those results are recorded into the monitor profile. You can think of the monitor profile as a "color error map", as it can be used to understand the existing color display errors in the default display. A color-managed app then grabs the monitor profile and uses it to figure out how it must change the numbers it sends to the screen in order to get the actual color it wants.
Now, to your questions.
The LUT values set in item #1 above cause the screen to change what color it displays for a given RGB value. This means that non-color-managed apps like Firefox will have their colors affected with the LUT values. You can see in the actual Firefox color measurements I took that indeed, it displays different colors for each color calibration even though it's not using the monitor profile. Since Firefox is not color-managed, it isn't doing anything differently across these different calibrations/monitor profiles. It's the LUT values that are set differently and causing the display to show different colors.
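Reduced to arithmetic, the two mechanisms look something like this (a hedged sketch; the LUT curve below is invented, and real calibrators compute it from measurements):

```python
# The video-card LUT: one 256-entry table per channel, applied to every
# pixel the card sends out, no matter which application drew it.
lut = [min(255, round(255 * (i / 255) ** 0.95)) for i in range(256)]  # invented curve

def on_the_wire(r, g, b):
    """What the monitor actually receives, for ALL apps (Firefox included)."""
    return lut[r], lut[g], lut[b]

# A color-managed app (Photoshop, Safari) additionally converts document
# values through the monitor profile *before* this stage; Firefox hands
# its values straight to the LUT.
print(on_the_wire(200, 120, 80))
```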
We are not finding this true in practice. The monitors Andy calibrated didn't show this to be true, and no calibration I did on mine (and I've tried it a whole bunch of different ways) found this to be true. Mine's not far off, but skin tones are noticeably different. For this to hold, one of two things would have to be true.
1) Either the native monitor color would have to be very close to sRGB.
or
2) The LUT calibration would have to be able to make the monitor produce very good sRGB such that when it's fed a given RGB value (anywhere in the color spectrum), it produces the right sRGB color on the screen.
The color numbers in my measurements show this not to be the case for my configuration. My monitor produces oversaturated colors when uncalibrated and at all calibration temperatures. I don't know if it's possible to use the LUT to make the default display match sRGB better or not, but my software is not doing it.
That's what I thought would be the case when I first started calibrating my system, but practice doesn't show that to be true. As in the previous section, I'd like to know if it's possible for the LUT calibration changes to make the default display more like sRGB or not, but regardless my Eye One Display 2 software is not doing that. This is not theory here, just actual measurements.
Same argument as the previous two sections. This assumes that the default display has been "calibrated" to be close to sRGB, either because it came that way out of the box or with the LUT changes. None of ours are that way after calibration. I wish it were true, but in practice we aren't finding it to be true.
The default display (what non-color-managed apps display) is clearly being changed when we calibrate our displays. You can see that from the color numbers in my measurements. Just look at the Firefox color values for the Peach color. They are significantly different at each color temperature, and all different from the unprofiled setting. My screen is oversaturated by default and it stays that way at each calibration setting. I don't know whether I can conclude that the Firefox view looks "worse" after calibration or not. I know that it's different and no more accurate than it was before. It seems like it would be possible to have calibration software that did no LUT management at all, and thus didn't change your default display, so Firefox was completely unaffected by the profiling process, yet still created a completely accurate monitor profile so color-managed apps could apply the right correction and display accurate color. This should be theoretically possible if the monitor wasn't natively too far off. If we could get the right software engineer who knows how the Eye One Display 2 software works in a room for two hours, we could get those answers.
As I hope I explained above, the monitor profile doesn't change the screen for Firefox at all. It's the LUT values in the video driver that do that. With different software design goals, it might be possible to create calibration software that would set the LUT values to make the default display get as close to sRGB as possible so that non-color-managed apps had much more accurate color. I don't know for sure. I'll do some searching on LUT calibration and see if I can learn anything. One thing I would expect is that if you tried to make LUT calibration produce sRGB, you might be squashing some of the colors that the monitor can produce. Certainly if this was done for a monitor capable of producing most of AdobeRGB, you'd be ruining that with a LUT that squished it down to sRGB.
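For the gamma part of that "right ballpark" job, here's a first-order sketch of how such a LUT could be built. Real packages measure many patches per channel, so take this as the idea rather than the Eye One's actual algorithm:

```python
def gamma_lut(native_gamma, target_gamma, entries=256):
    """One-channel LUT that coaxes a `native_gamma` display toward `target_gamma`.
    The panel shows (lut_value/255)**native_gamma, so feeding it
    v**(target/native) yields v**target end to end."""
    exp = target_gamma / native_gamma
    return [round(255 * (i / (entries - 1)) ** exp) for i in range(entries)]

# e.g. nudge a native-2.4 panel toward the 2.2 this thread keeps advocating
print(gamma_lut(2.4, 2.2)[:8])
```

Note that this only bends the tone curve per channel; it can't move the primaries, which is why the saturation mismatch survives calibration.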
Homepage • Popular
JFriend's javascript customizations • Secrets for getting fast answers on Dgrin
Always include a link to your site when posting a question
This illustrates a number of critical points:
1. Calibration alone isn't enough. That's WHY we have the ICC architecture we do, where profiles are necessary for the Display Using Monitor Compensation process. We've seen this historically. The ONLY displays I've ever seen that can be calibrated to visually identical specs are true reference displays: Radius PressViews calibrated to ColorMatch RGB, Barco Reference V, etc. We're talking (in 1990 dollars) units that cost $3-5K.
2. No amount of calibration makes a unit exactly into an sRGB device. Let's recall that sRGB is a theoretical color space based on a specific device, but not necessarily on a real device. It's probably possible to take an old CRT display with P22 phosphors, place it into the surround specified by sRGB (the ambient light conditions in terms of intensity and color), and make it mimic sRGB very closely. Your modern LCD isn't going to do this. It's been designed to get "close", and it takes a lot going on internally for this to happen. We must have a profile that defines the current device behavior and an ICC-aware application to produce the visual effect of sRGB on differing units.
Bottom line is, no matter how you calibrate that modern display, it might get close but won't exactly produce sRGB. Just look at the rather huge differences in the light source of the two emissive technologies and I think you'll see how difficult this is. Also look at the huge differences in the maximum luminance of even a new CRT (maybe 100cd/m2) and a new LCD (many can easily hit 300). The sRGB spec had no idea these devices would exist when it was designed (again, solely from simple math).
The only way to get multiple dissimilar displays to produce the same color appearance is to fingerprint their conditions using a profile and let the profile compensate on the fly in the application. Or expect every user to have a true reference display. But getting close outside ICC-aware applications isn't necessarily a bad idea, UNLESS you've got to screw around with the 8-bit LUTs to the degree that you move very far from the native condition; then the result is more banding on-screen. Newer units with high-bit internal adjustments overcome this. But then, ask yourself if forcing a non-sRGB-behaving display into an sRGB behavior is a good idea. Usually not. And if you're talking about newer units that exceed sRGB gamut, forcing them into sRGB is even more pointless.
We need to move away from sRGB as some kind of useful, well-defined standard. It's not 1993! And even more shocking is that of all the digital imaging technologies since then (cameras, scanners, processors, software), the one technology that hasn't moved anywhere near as quickly is display technology. That's changing (slowly). sRGB is a dinosaur.
Author "Color Management for Photographers"
http://www.digitaldog.net/
OK, a little more info on the LUT. The LUT values can be stored in the monitor profile, but are acted upon by external software (either in the OS or something that runs at startup) and applied to the video card. One source I've read suggests that the LUT values are used by most software to get the monitor to a proper gamma and not used to calibrate the color.
Here's a thread from earlier this year where Andrew shares his view on LUT manipulations.
I've found several other discussions that say that you get the best color reproduction in color-managed apps if the LUT value changes are minimized. This happens if you set your calibrator for native color temperature.
If you start manipulating color in the LUT too much, you start diminishing the colors that can be produced on the monitor, and this can also lead to banding (if you lose the ability to produce certain colors on your monitor). Part of this is because most LUTs are 8-bit, and multiple 8-bit manipulations on an 8-bit display start to cause rounding errors, which lead to errors in the lowest bits, which manifest as banding.
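The rounding-error point is easy to demonstrate with two back-to-back 8-bit remaps (both curves below are invented for illustration):

```python
# Two successive 8-bit remaps: a crude saturation-style boost, then a
# gamma tweak. Each rounds to integers, so output levels collide.
boost = [min(255, round(v * 1.10)) for v in range(256)]          # invented curve
gamma = [round(255 * (v / 255) ** (2.2 / 1.8)) for v in range(256)]

out = [gamma[boost[v]] for v in range(256)]
print(len(set(out)))  # < 256: the missing levels show up on screen as banding
```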
I've also found a whole slew of articles that say that Windows Vista has all sorts of problems with losing the LUT values once they've been set. Here's one particular article. Apparently they can get lost after your screen times out and gets turned off (if you have it set that way) and whenever Vista dims your screen to ask for permission to do something. What a mess! Maybe this will get fixed in the Vista update due this quarter.
Homepage • Popular
JFriend's javascript customizations • Secrets for getting fast answers on Dgrin
Always include a link to your site when posting a question
Chris,
I don't know anything about the rocket science, but the short answer is that the 'problem' monitor hardware displays a higher gamut than the sRGB space, hence the saturation boost when the monitor profile is applied/assigned/assumed to sRGB.
The issue is very easy to nail in Photoshop:
1) Convert to sRGB in Photoshop
2) go to View> Proof SetUp: Monitor RGB
That should show the saturation boost you are seeing exactly as untagged sRGB will display in Mac web browsers.
I think calling the monitor manufacturer's and the hardware calibrator's tech support and asking them WHY this is happening is a more productive use of time… this exact problem was beaten to death in the Photoshop forum recently: http://www.adobeforums.com/webx?14@@.3c053066/0
G BALLARD
Do you understand what View>Proof Setup: Monitor RGB does? Cause it doesn't solve any of the issues we've discussed.
This setting tells Photoshop to use its ICC-aware capabilities to display an sRGB document (in this case) as if Photoshop were a dumb web browser. It's doing an sRGB to Monitor RGB (YOUR specific monitor) conversion. The only thing this soft proof setting is useful for is showing you how the sRGB doc would look if you viewed it on YOUR machine in a non-ICC-aware application. But you could do this by simply opening the sRGB document in IE on your machine. Photoshop just wants to provide a soft proof of this for you while you're in Photoshop.
No one but you will see this preview. Photoshop has no way to understand what any other user's display is doing (certainly without that display and a profile for it). So you're back to square one.
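For anyone who wants to approximate that soft proof outside Photoshop, here's a hedged sketch using Pillow's ImageCms (littlecms). The monitor profile path is hypothetical, and this only simulates the dumb-browser appearance for YOUR display:

```python
from PIL import Image, ImageCms

img = Image.open("test_srgb.jpg")        # sRGB numbers, as uploaded (made-up file)
monitor = "/path/to/my_monitor.icc"      # hypothetical: YOUR display's profile
srgb = ImageCms.createProfile("sRGB")

# A dumb browser sends the sRGB numbers straight to the screen, where they
# get interpreted as monitor RGB. Simulate that: assign the monitor profile
# to the raw numbers and convert to sRGB for viewing in a color-managed app.
dumb_view = ImageCms.profileToProfile(img, monitor, srgb)
dumb_view.save("as_a_dumb_browser_shows_it_here.png")
```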
Author "Color Management for Photographers"
http://www.digitaldog.net/
Two things could explain why this problem seems to have gotten more prominent lately. First, LCDs have taken over the world and most of them really can't be color calibrated much at all (they get profiled for color-managed apps). CRTs could get calibrated.
Second, some of the newest generation of monitors have a wider gamut than the older generations, which, Gary explains, causes more of a mismatch when the monitor is assumed to be sRGB (because the monitor is actually further from sRGB than older ones).
This is, of course, only an explanation for what's happening and still offers no new ideas on how to set things up to minimize the difference between non-color-managed Firefox/IE and Photoshop/Safari.
Homepage • Popular
JFriend's javascript customizations • Secrets for getting fast answers on Dgrin
Always include a link to your site when posting a question
Maybe it doesn't make a good ad to say, "Now YOU can see the Internet in wonky color as never before! Go from suntan to sunburned without even stepping on a beach!"
"But don't worry, you can always go look over the shoulder of the accountant, who has just a regular Dell. The Internet looks natural on it because you didn't think to use the Eye-One for the QuickBooks machine."
However, I read a review of the just-released Dell 30" and they said they learned from the previous version...the one we own. Now they've gone control-crazy with individual controls for almost everything and, importantly, an sRGB preset.
Behold:
http://www.hothardware.com/articles/Dell_UltraSharp_3008WFP_30inch_LCD_With_DisplayPort/?page=4
They said it took them time but with those controls they were able to dial in accurate color.
I saw other posts for the model we have and this was a typical response, from this page:
Paul, I would like to ask you to do an experiment for me if you would.
1) Would you open a browser window of some website with some good photography.
2) Take a screen shot and while viewing the image at the website, view the screen shot in Photoshop side by side but make sure to have the color management settings on “North American Web / Internet” for Photoshop. This will ensure the PS image is displayed correctly in sRGB.
From what I have read about other high-gamut monitors, none of them display colors correctly outside of color-aware programs like Photoshop. I want to see if this is happening with the Dell as well.
What happens with the other monitors is that everything appears oversaturated. The reason is that the default profile for the monitor is wide gamut, but although sRGB is the default for Windows and the web, all images are displayed in that same wide gamut unless you're using a color-aware program. This maps the colors incorrectly, resulting in an oversaturated appearance. Right now, with monitors only being at 92%, the effect is somewhat subtle, but as gamuts rise to 100-120% next year, I suspect at some point people are going to stop thinking they are seeing new colors and realize something is wrong.
I'm feeling like we're getting down to it now. You?
Before you get too excited, please keep in mind what you're getting.
No CCFL LCDs have physical control over RGB; it's a LUT. The LUT can be 8-bit at the graphics card (not useful) or, with newer, better, more expensive units, internal at higher bit depth (more useful). The only LCDs that allow true control over RGB are the two I know of that use an LED backlight with three colored LEDs (Samsung and NEC).
Lots and lots of CCFL LCDs have all kinds of controls, many called RGB. You need to look at where and how the adjustments are being conducted. Messing around with 8-bit LUTs might give you the impression you've altered the color, but the result is almost always a lot of banding on the display (a step forward, a step back).
When you view an image, it's nice to know if banding is in the actual document or the result of the video system. The latter is neither useful nor desired.
You've got to get past this idea that sRGB is somehow a default for stuff and right. It's not. Your display doesn't produce sRGB. There's no default profile for the monitor that expects this (the so-called default profile for the monitor should reflect the conditions of the monitor and be used in applications and the OS). I thought we made that pretty clear.
On a Mac, with my wide gamut display, using the myriad ICC-aware applications and the OS, everything looks just fine (not under- or over-saturated). And we know why. Yes, a wide gamut display is farther from the elusive 'goal' of sRGB, so in a dumb web browser the saturation looks different than on a lower gamut unit.
The bottom line is, if you don't have an ICC aware application, things are not going to match. If you're using a wide gamut display outside an ICC aware application, you have to ask yourself why you are doing this or why you spent the extra money for a wide gamut display.
Author "Color Management for Photographers"
http://www.digitaldog.net/
They want to see QuickTime movie trailers on Apple's site in the colors Spielberg intended.
Andrew, we get your point. If you want accurate color, profile your display and use color-managed apps. You do not need to repeat that any more - we get it. That is not the issue we are working on.
Given that there are hundreds of thousands of Smugmug viewers (many are not customers and many are people that Smugmug has never spoken to and will never speak to), we are working on what is the best tactic to understand, mitigate or explain their experience if possible. We know it won't be accurate color, but we're working on understanding what things, if any, could be done to give them the best possible experience. That is the majority of the viewing public today and for awhile, so it seems like a worthwhile endeavor for a business that is trying to serve them. If you think that is pointless, then that's fine and you can bow out of the conversation and let us continue until we reach our own conclusion.
Maybe you're right (that there's no worthwhile mitigation of any kind that we can do), but we aren't done understanding our options yet, and we're on our own journey to understand how all this stuff works. At the very least, we're gaining an understanding of why things are the way they are when they aren't color managed, so we can at least explain it to folks who ask (and perhaps motivate them to get color managed).
So far, I see pretty good progress on our end:
Homepage • Popular
JFriend's javascript customizations • Secrets for getting fast answers on Dgrin
Always include a link to your site when posting a question
It's not! If you understand that few people's displays, even when calibrated as you and Andy and others have tried, produce the same color appearance when viewing sRGB images as seen in Photoshop, how can you say everyone is seeing sRGB?
IF what you believe is true, then we'd all be seeing the same color appearance by simply uploading sRGB. Clearly that's not happening.
Upload an sRGB image, or open an sRGB image on two different machines using two different displays in a non-ICC-aware application. If possible, do this on a Mac and a Windows box. Do they look identical? No. So how can sRGB be some kind of standard, correct assumption for a document that you KNOW is in sRGB?
You have to stop drinking this MS Kool-Aid about sRGB. Yes, it is the closest color space for the zillions of CRT displays out there (lack of calibration notwithstanding), but they are not producing sRGB. If they were, we wouldn't be having this conversation.
Think of sRGB like the Government specified ratings for miles per gallon. YMMV (and it will).
I keep hoping you're getting it, only to be sidetracked by this belief system about sRGB. Again, the only device that can really produce sRGB is a circa-1993 CRT with P22 phosphors. And even if we had 5 such CRTs, they change over time (the reason we frequently have to calibrate all displays). Or someone mucks around with the OSD controls of the display and, bingo, the old sRGB behavior is no longer.
Author "Color Management for Photographers"
http://www.digitaldog.net/
I still wonder if everyone does 'get it' when they keep repeating this nonsense about sRGB. See above post.
The only way we can possibly get close to this sRGB behavior (display technology notwithstanding) is using calibration, even without access to a profile.
I simply can't imagine why folks think that, out of the box, all these display systems are producing sRGB, based on what you've all done to test this so far. Then we have displays that are 3 years old, one running at 150cd/m2 and another at something else, one costing $1800 and another costing $300, plus all the mucking around people do to their displays to make them look "right". They are all producing sRGB? Not on your life.
The sRGB color space doesn't make a difference here. It's the closest color space we have to most displays (well, certainly the slew of old CRTs). But we all know the same color numbers in sRGB uploaded to the web absolutely do not produce the same color appearance for all users. So how is sRGB useful here? It's the best target we have, but we're trying to hit a very broad side of a barn, and hitting the barn and hitting the bulls-eye are two very different things.
Of course (for the last time), we know that we can produce the same color appearance from the same numbers from any color space, not just sRGB, and we know by now what's required of users to get this. Otherwise, all bets are off.
Author "Color Management for Photographers"
http://www.digitaldog.net/
A very good analogy. Those are actually just benchmarks, and are relevant only when compared to other vehicles' MPG. Nothing more. Heck, this year they did a wholesale change in how they are calculated and went back to restate older MPGs.
- How & Where You Drive
- Vehicle Condition & Maintenance
- Fuel Variations
- Vehicle Variations
- Engine Break-In
Therefore, the EPA ratings are a useful tool for comparing the fuel economies of different vehicles but may not accurately predict the average MPG you will get (http://www.fueleconomy.gov/feg/why_differ.shtml). Think about it. How many people have you met who are shocked, if not downright PO'ed, that their mileage does vary from that window sticker?
Apparently we have HP to blame as much as MS. Such interesting reading on how they specifically dis ICC profiles being embedded:
We expect application developers and users that do not want the overhead of embedding profiles with documents or images to convert them to a common color space and store them in that format. Currently there is a plethora of RGB monitor color spaces attempting to fill this void with little guidance or attempts at standards. There is a need to merge the many standard and non-standard RGB monitor spaces into a single standard RGB color space. Such a standard could dramatically improve the color fidelity in the desktop environment.
{emphasis added}
Besides changes in display technologies, today you have more people moving from our previously mentioned Group #3 to Groups #2 & #1. Heck, there are simply more users. And in 1993-1996 most people were using Lotus 1-2-3 and Word Perfect on their Windows 3.11 PCs, maybe Win 95, on Netware 3.11 networks at work. The only thing left from then? sRGB! (OK, I do know folks who still have WordPerfect and Netware - 4 of them)
Another observation: Besides applications, monitor mfg, calibration/profiling tools, what about the video card manufacturers?
-Fleetwood Mac
I have been reading and digesting all the great research and ideas.
On the original problem with the Dells I would point you to an article from Eizo about monitors with extended gamut. We use Eizo for our soft proofing stations.
http://www.eizo.com/support/wp/pdf/wp_06-001A.pdf
I think the main problem here is a mismatch with your calibrator software and the total gamut of the monitor. Profiles created with a calibrator don't change a color only translate it to the output. If the curve mapping in the calibrator software does not match the total gamut of the monitor then the translation will not be accurate. I have noted some new calibrators that are designed to be used with HDTV's and I wonder how one of those would work on your Dell models.
I really enjoyed the comparison tests with the three up images on screen. We do this all the time with profiles and call it a round trip of the profile. We do it with printer profiles and not display but I believe the theory is the same.
I will include a link to gamutvision which can be used for this type of testing and will give you allot of the same numbers.
http://www.gamutvision.com/docs/roundtrip.html
Discussion on white point and lack of direction from anyone on what is the best setting.
I think the biggest point here is what is your intent and setup for your monitor. If no color management is going on and out of the box you want to know where the white point should be I would say 6000-6500 because the average viewer thinks this looks good. White is white and at normal screen brightness levels most people would say 5000 is dark and warm or yellow. Now if you are using this monitor as a calibrated piece of equipment with print matching and controlled viewing boxes that is a different story. We use high quality viewing stations that are set to 5000 and as such we must have our white point set to match this in order to soft proof properly. If we did not do this then everything would look too warm comparing the prints in the viewing station and the screens.
We also match the brightness of the viewing stations to the brightness level of the screen so our density judgments are not thrown off.
I still am not sure if everyone who reads this understands sRGB and color spaces. It is not the easiest concept to get your arms around. For that matter profiles themselves are pure math in action and we all know how exciting that is to talk about at parties. I'll save this for another post if anyone is interested.
One thing that is most important with profiles is white point as it is from end to end in digital photography and old analog photography as well.
Something everyone might want to ponder is from ages gone by when a new film came out, say velvia as I think this was mentioned it had allot of color saturation more than allot of other films but did we say after looking at velvia that the other films were out of calibration? No, this was the look of Velvia and either you liked it or you didn't and maybe you used it for some projects and not for others. This also applies to digital and color spaces and calibrations. Remember that in the end all output devices have a set gamut and can only produce within that range. We have to interpret in the image what the real life color was and then translate that to something the output can reproduce perceptually pleasing and what our memories tell us it looked like. This is the function of profiles to squeeze all of that data into whatever space we need to get the job done.
Well probably I've created more questions than answers.
Carry on netCitizens
1890 Beaver Ridge Circle
Norcross, GA 30071
W : 678-405-5500 ext 5620
C : 678-790-4553
John, you're a master at keeping the most important goals in focus.
I've gotten a few emails from the Apple guys saying my old boss would like to see an email from me with a really crisp suggestion for what to do about this mess we've created. As this thread points out so clearly, all of us have points of confusion despite how much time we've devoted to it. Imagine the plight of the consumer.
Here's a trial balloon for what we should tell Steve:
1. Let's ship the Mac with gamma set to 2.2. It's the Internet and TV standard. No Apple person is defending 1.8 anymore and the web is full of people suggesting 2.2, including Apple's own pages and the monitor calibration vendors.
I'll show him a Pixar clip on an iPhone, which seems to have a 2.2 gamma (can someone confirm?) and looks good, next to a Mac, where his films look washed out. That oughta cause his head to explode all over the room.
Spielberg and Lucas spend gobs of money finessing their films to get them just right for TV. Yet the owner of his own film studio is the guy most responsible for making their films look washed out.
(I'm intimately familiar with the decision to go 1.8 gamma, 'cus I worked for Steve in the days of dim monitors and color gurus who envisioned the web as a pre-press medium instead of the consumer medium it has become.)
2. Create a tag for color space info. Glomming a bunch of fatty ICC profiles onto a page would be like Andrew handing out a chapter of his book instead of his business card to people he meets. Anyone who writes code, and Steve, has gotta say, "no brainer."
3. Default to sRGB for untagged elements. HTML and CSS only know sRGB, so rendering them in anything else is expressly rendering them in something the designer did not intend. Same for all Flash designed to this point, 99.999% of all jpegs, and pretty much all gifs and pngs.
If people want aRGB or ProPhoto, they can tag their images. That 0.1% of the population would know how. (There's a sketch of how items 2 and 3 fit together right after this list.)
One. Two. Three. Simple. Respects the consumer network the Internet is and the standard that artists create their work for, and yet lets people with wider gamut lust have wider gamut.
No?
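To make items 2 and 3 concrete, here's a hedged sketch of the lookup order a browser might use, with Pillow's ImageCms standing in for the browser's CMS; all file names here are made up:

```python
# Honor a tagged/embedded profile when present (item 2), otherwise assume
# sRGB (item 3), and always compensate for the monitor profile.
import io
from PIL import Image, ImageCms

monitor = ImageCms.getOpenProfile("MyMonitor.icc")  # hypothetical
srgb = ImageCms.createProfile("sRGB")

def to_screen(im):
    icc = im.info.get("icc_profile")  # item 2: the tag, if the file has one
    src = ImageCms.ImageCmsProfile(io.BytesIO(icc)) if icc else srgb  # item 3
    return ImageCms.profileToProfile(im.convert("RGB"), src, monitor)

to_screen(Image.open("mystery_meat.jpg")).show()
```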
This sounds like a good summary of the asks.
If I understand what you're proposing, and how tagged images interact with gamma settings, wouldn't item 2 allow you to fix everything on Smugmug for people running the fixed version of Safari? And isn't item #2 free of any tradeoffs for Apple or for customers (meaning there's no downside to it)? It has no backward compatibility issues: only sites that choose to implement the new tags get the new behavior. Since FF3 has already implemented some of these new tags, once you put the new tags in Smugmug you'd be set for both new browsers.
I agree with items 1 and 3 as the "right" thing to do, and if they'll do them, that's great, but I'm sure there's a little heartburn or resistance about them. It may be so widely accepted that item 1's current behavior is wrong that they'll finally just fix the gamma issue, but I can certainly see how some existing things may have adapted to the 1.8 setting and may seem broken on new Macs once they switch.
Item 3 IS the right thing for the internet as a whole. While an untagged image is mystery meat (to borrow Andrew's phrase), an image on the open internet has ZERO chance of being in your own monitor profile. Your monitor profile is specific to your monitor, or to its make and model if you haven't profiled your display and installed a default display profile instead. The image was put on the internet by someone who knows nothing about your system, so it can't possibly be in your monitor profile. Assuming the monitor profile is probably the worst choice there is.
Assuming sRGB at least has a large chance of being right. The only time assuming sRGB would be worse than today's behavior is if the monitor profile is corrupted and contains bad data; then using it to translate colors to your monitor would be worse than not using it at all (which is what Safari does today for an untagged image).
I can imagine that there is a bit of heartburn about changing item #3 because it won't match Flash until Flash also starts assuming untagged images are sRGB and implements color management.
So, the point of this posting was mostly to say that you might want to change the order of your asks. I think you can solve your problem for new versions of Safari if they implement #2 and it really has no downside at all for Apple. The other two items are also good things to do, but aren't required for you to solve the Smugmug issues in Safari and may be harder things for Apple to decide to deliver because they involve some short term tradeoffs.
I'm making an assumption that an image in Safari with an ICC profile displays properly whether the Mac gamma is set to 1.8 or 2.2. The monitor compensation that happens in color-managed Safari will accurately compensate for the different gamma settings. If I've got that wrong, then you would need item #1 too in order to get things working by default on new Macs.
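For what it's worth, that assumption checks out on paper for an idealized, gamma-only display profile: the compensation and the panel's gamma cancel at either setting. A tiny sketch (the gamma-only profile is my simplification; real profiles do more):

```python
# Color management encodes the target luminance Y as Y**(1/gamma), the
# panel raises it back to its gamma, and the two cancel either way.
for gamma in (1.8, 2.2):
    Y = 0.40                     # luminance the tagged image calls for
    device = Y ** (1.0 / gamma)  # value the compensation sends to the panel
    shown = device ** gamma      # luminance the panel actually emits
    print(f"gamma {gamma}: device {device:.3f} -> shown {shown:.3f}")
```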
Video is becoming very important to us, and I know it is to Apple. If you save your production in various ways (iMovie, Final Cut, After Effects) and play it back in various ways (Flash, QuickTime), you find out in a hurry that it's a mess too, mostly due to non-standard gamma.
I get the feeling that the people in this thread mainly deal with stills. The Hollywood boys are always howling about the gamma and posting stuff like this in the forums:
for the 6 millionth time:
your Mac displays are 1.8 gamma; PCs and TVs are 2.2
To work in video, it is best to change the gamma on your computer display in System Preferences > Displays to correctly display TV gamma
gary adcock
Studio37
They don't want to hear about your mileage varying or profile fairy dust; they want fidelity with SMPTE 170M and its equivalents for HD.
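Their complaint is easy to put numbers on. A back-of-envelope sketch (plain Python, no color management involved) of what an uncompensated 1.8-gamma display does to midtones mastered for 2.2:

```python
# The display raises the signal to its gamma, so a midtone encoded for a
# 2.2 display comes out noticeably lighter on a 1.8 display: washed out.
for v in (0.25, 0.50, 0.75):
    intended = v ** 2.2  # luminance the filmmaker expected (2.2 display)
    shown = v ** 1.8     # luminance an uncompensated 1.8 display produces
    print(f"signal {v:.2f}: intended {intended:.3f}, shown {shown:.3f}")
```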