Herma Ornes , Jan 27, 2012; 06:12 a.m.

I am still recovering from a motherboard failure that happened in December. It's been weeks and I am still re-installing and re-assembling all the parts: 2 monitors, multiple external drives, re-installing software, etc. A major PITA.
Last year I bought a Dell U2410 as my main monitor (resolution is now 1920x1200). My new computer (HP Pavilion HPE) lets me connect the monitor with an HDMI cable (one that supports 1080p), whereas before I had to connect with a DVI-D cable. (I don't know if that makes a difference.)
My new system has an NVIDIA GeForce GT 520 with CUDA graphics card (whatever that means), and I am running Windows 7. I am also using an 8-year-old Dell 2001FP as my second monitor, connected with the DVI-D cable. I have a Spyder 3 and I am ready to start calibrating the monitors on my new system.

My observation, before even starting the calibration, is that the colors on my 8-year-old Dell are so much more pleasing than on the new U2410. As a preset on the U2410 monitor I've picked Adobe, because it looked the best of all the presets. I have turned the brightness down to 25%, because I remember those settings from my old system.
I have a test image file and a test print from Mpix to help me find a starting point. I am trying to make my monitor look like the test print as much as possible, but it is really hard. I hate the way the U2410 displays the test image: the colors are washed out and the blacks are too wimpy. Now I really don't know where to start.
I assume that just running the Spyder 3 software and calibrating on the baseline ugliness being displayed is not going to make the colors on the U2410 look good.
Any tips would be appreciated. I wouldn't know where else to get some ideas.

A side-by-side comparison of both monitors. Guess which is the U2410?
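For context on what the Spyder 3 step later in this thread actually does: a calibrator ends up loading a per-channel lookup table (LUT) into the video card. The sketch below is a deliberately simplified illustration of that idea, with a hypothetical native panel gamma of 1.8 being corrected toward the common 2.2 target; real profiling measures many color patches and does far more than this.

```python
# Simplified illustration of monitor calibration output: a 256-entry
# lookup table (LUT) that the video card applies to every pixel value.
# The native gamma here is a made-up example, not a measured value.
NATIVE_GAMMA = 1.8   # hypothetical uncorrected response of the panel
TARGET_GAMMA = 2.2   # common calibration target

def build_lut(native=NATIVE_GAMMA, target=TARGET_GAMMA, size=256):
    """Return a LUT mapping input levels to gamma-corrected levels."""
    lut = []
    for i in range(size):
        x = i / (size - 1)              # normalise level to 0..1
        y = x ** (target / native)      # compensate native vs target gamma
        lut.append(round(y * (size - 1)))
    return lut

lut = build_lut()
# End points stay fixed; mid-tones get pulled down toward the 2.2 target.
print(lut[0], lut[128], lut[255])
```

The real work a Spyder does is measuring the panel so this table (and the monitor profile alongside it) reflects actual behavior rather than an assumed curve.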



William Kahn , Jan 27, 2012; 09:03 a.m.

I'm thinking you should calibrate the monitors first, and then compare. No two monitors right off the shelf are going to look the same...

Patrick Lavoie , Jan 27, 2012; 11:00 a.m.

The U2410 is an excellent monitor, used by many pro photographers I know, and by some retouchers I know as well, either as a second monitor or as their primary one. Until calibration, most if not all monitors look like s***, so start by using a good calibration device and judge afterwards.

Also, LCD monitors have less vibrant color than older CRTs; the glass covering the panel makes everything look more vibrant, let's say, but far from a real-life print if not toned down.

Don't forget to do this before calibration:

1. Turn OFF / remove Adobe Gamma from your system.

2. Put your new monitor back to its factory defaults.

3. Install any drivers you have BEFORE connecting anything.

Harry Joseph , Jan 27, 2012; 12:39 p.m.

"My new computer (HP Pavilion HPE) lets me connect the monitor with an HDMI cable (one that supports 1080p), whereas before I had to connect with a DVI-D cable. (I don't know if that makes a difference.)"

I have an HP Pavilion HPE and spent months with HP customer service trying to figure out what was wrong with the computer, since it kept crashing. Finally, after about 4 months of haggling, I sent it back to HP. They replaced the motherboard and things are mostly fine now, but once in a while it still crashes. According to HP, something in Windows 7 is causing some type of incompatibility, and it will cost $150 to upgrade. Yeah, right!

I have an NEC 221W monitor that is connected to the computer by an Analog D-sub 15 pin cable. I tried using the DVI connection, but that did not seem to work on my computer. My monitor did not come with an HDMI cable as far as I know, so maybe I need to buy one. I even called HP customer service to get some instruction and they told me to use the blue 15 pin cable.

The resolution on my screen is very accurate, but I like the smoothness and vibrancy of my 7-year-old Dell CRT a little bit better. Although my monitor came with SpectraView calibration software, I didn't have to do much calibrating: right out of the box the colors were fine. So you could be having another issue, since your colors are way off.

Frank Skomial , Jan 27, 2012; 01:59 p.m.

Both HDMI and DVI carry digital signals, and trying to use an analog D-sub (VGA) socket and cable without a proper converter is a futile activity.
This can be confusing, since there are vendors selling direct VGA-to-DVI or VGA-to-HDMI cables, to the despair of owners after purchase.

The HDMI outputs I have encountered went as high as 1920x1080, but no higher resolution was provided. HDMI seems to be limited and tailored to the HDTV standard: Blu-ray players and the HDMI inputs on the latest large LCD HDTVs.

From DVI, or even from the old analog VGA output, you can usually configure the video adapter card to allow a much higher resolution, if your monitor is capable of it. That is something you perhaps would not get out of the HDMI interface.

Since HDMI and DVI are the same digital signal format, there are simple adapters available that let you use an HDMI cable on a DVI output socket.

There was a recent complaint that some of the newest laptops provide only an HDMI socket, with no VGA or DVI socket. Even though the built-in NVIDIA or ATI cards are capable of producing much higher resolutions, the single HDMI socket is all they get.

Steve Dunn , Jan 27, 2012; 04:49 p.m.

I can't really help you much with the look of your screens, other than to note that every time I've changed video cards and/or monitors, what I see on the screen looks at least slightly different (assuming I've made some manual adjustments, possibly guided by something like Adobe Gamma but not something as sophisticated as a proper profiling product). But I can confirm that what you're seeing is not a result of HDMI vs. DVI-D. HDMI quite intentionally started out using the same electrical and digital specs for its video as DVI-D used, so a simple adapter that has the appropriate mapping between input and output pins can convert between the two for typical computer display purposes*.

Given a hypothetical video card with both DVI-D and HDMI outputs, and a hypothetical monitor with both DVI-D and HDMI inputs, you should see the same thing on screen regardless of whether you connect the two via DVI-D, HDMI, or a mixture of the two with an appropriate adapter.

*: There are areas where one standard supports something the other doesn't. The most obvious one is that HDMI supports audio while DVI doesn't. Both standards support many resolutions beyond 1080p, but not necessarily in ways that are compatible with each other, so if you have a video card and a monitor that both support ultra-high resolutions but use different connectors, you may be stuck. But for up to 1920x1080 or 1920x1200 output from a computer to a computer monitor, they're essentially interchangeable.
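Steve's rule of thumb can be summarized as data. This is just an illustrative sketch of the points made in the post above (audio support and the "interchangeable up to 1920x1200 for computer use" rule), not an exhaustive comparison of the two standards:

```python
# Rule-of-thumb summary of DVI-D vs HDMI for computer displays,
# per the discussion above: same digital video signal, so a passive
# pin adapter works; the practical differences lie elsewhere.
CAPS = {
    "DVI-D": {"audio": False},
    "HDMI":  {"audio": True},
}

def adapter_ok(width, height):
    """True if a passive DVI-D<->HDMI adapter is unproblematic at this
    resolution (fine up to 1920x1200; beyond that, the standards'
    higher-resolution modes may not line up)."""
    return width <= 1920 and height <= 1200

print(adapter_ok(1920, 1200))  # the U2410's native resolution
print(adapter_ok(2560, 1440))  # ultra-high res may need matching connectors
```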

Herma Ornes , Jan 27, 2012; 07:15 p.m.

Harry, it's funny you should mention the crashing. This new computer is indeed crashing quite frequently, from Windows "trying to find a solution" to a few blue screens of death.
Another interesting find: through Windows > Settings > Screen resolution, the highest option was only 1920x1080, leaving me with unused strips at the top and bottom. I was only able to change the resolution to 1920x1200 in the NVIDIA graphics card settings. Problem solved.
Then I did what Patrick told me (other than the removing-Adobe-Gamma bit): reset the monitor to its ugly factory defaults and ran the Spyder 3. It is SO much better! I even Spydered the ol' Dell 2001FP, and now my test image on the two monitors and the print itself are pretty darn close.
The only question I now have: since HDMI only supports 1080, what happened when I set it to 1200? Did I lose some quality? Or is that all a wash anyway?

William Kahn , Jan 27, 2012; 10:23 p.m.

Herma, re the dreaded BSOD and other failure issues, check to make sure the RAM DDR modules are properly seated in their sockets...

Frank Skomial , Jan 28, 2012; 04:06 a.m.

The only question I now have is: Since the HDMI only supports 1080, what happened when I set it to 1200?
Depending on your monitor settings, there are most likely 2 cases, neither of which will please you.

1. The picture will be stretched onto 1200 lines, and the monitor will not work at its best, since a non-native pixel resolution is projected onto it.

2. You will see dark bands of some 60 lines at the top and the bottom of the screen, but the picture quality in the center will be much sharper and better.
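The arithmetic behind the two cases is simple enough to check: a 1920x1080 signal on the U2410's 1920x1200 panel is either stretched vertically or letterboxed, and the band size falls out of the resolution difference.

```python
# Arithmetic behind the two cases: a 1920x1080 signal shown on a
# 1920x1200 panel is either stretched (non-native scaling, soft image)
# or letterboxed with dark bands at the top and bottom.
SIGNAL_W, SIGNAL_H = 1920, 1080
PANEL_W, PANEL_H = 1920, 1200

# Case 1: stretched -- each source line covers more than one panel line.
stretch_factor = PANEL_H / SIGNAL_H
print(f"vertical stretch: {stretch_factor:.3f}x")   # ~1.111x, visibly soft

# Case 2: letterboxed -- unused panel lines split between top and bottom.
band = (PANEL_H - SIGNAL_H) // 2
print(f"dark band per edge: {band} lines")          # 60 lines each
```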

On a monitor capable of 1200 lines I do not use HDMI; I use the good old VGA output socket, which both of my Vaio laptops have. I play back Blu-ray videos via HDMI on a large HDTV, and I watch pictures and home movies via the HDMI connector directly from a camera or camcorder.
