A Site for Photographers by Photographers


"Shadow" images with new LCD monitor...?

Ricardo Trindade , Nov 04, 2006; 02:16 p.m.

Hi folks,

I recently bought a ViewSonic VX2025WM, to replace an old 17" CRT.

It seems great except that I see a slight "shadow" effect of the image. For example, on this text that I'm writing on the white background, the last letter of each word is repeated in a faint grey to the right of it. It is very faint, but it is there. I see "monitorr" instead of "monitor".

I'm a little annoyed, but I hope it's something that can be adjusted. I'd hate to have to mail it back to Newegg, but I will if I have to....

Anyone know of an adjustment that might get rid of this problem?

Thanks in advance!



J C , Nov 04, 2006; 02:35 p.m.

I am in the same boat, Ricardo, except mine happened when I bought a new graphics card (Nvidia). I have about five ghosts to the right of my text and images. Dropping to 800x600 resolution fixes it, but that resolution is no good. I have a Sony 21" CRT and had the same problem when I bought my last graphics card (ATI), but that seemed to fix itself, I guess when I installed a later graphics card driver. So I would say it is some sort of conflict between your graphics card and your monitor.

Karl Martin , Nov 04, 2006; 05:38 p.m.

Sounds like you are using a VGA (analog) connection rather than DVI (digital). Sometimes you get a good signal with VGA, sometimes you don't... it can be a crapshoot, depending on the quality of the cable, the video card, and how much electrical noise there is in the vicinity. You can try a higher-quality cable, but you are better off using DVI if you can. It is pretty much guaranteed to give you a sharp picture.

Alternatively, try adjusting the refresh rate. What worked for your CRT is probably not good for your LCD. On a VGA signal, 60 - 70 Hz is usually best for LCD.
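As a rough back-of-the-envelope sketch (not exact VESA timing math -- the ~30% blanking overhead is an assumed simplification), here is why analog signal quality gets harder to maintain at high resolutions and refresh rates: the VGA cable has to carry a very fast analog pixel stream.

```python
# Rough estimate of the analog pixel clock a VGA cable must carry
# at the VX2025WM's native mode. The 1.3 blanking factor is an
# assumption, not a real CVT/GTF timing calculation.
width, height, refresh_hz = 1680, 1050, 60

active_pixel_rate = width * height * refresh_hz  # visible pixels per second
pixel_clock = active_pixel_rate * 1.3            # add horizontal/vertical blanking

print(f"active pixel rate: {active_pixel_rate / 1e6:.1f} MHz")  # ~105.8 MHz
print(f"approx. pixel clock: {pixel_clock / 1e6:.1f} MHz")      # ~137.6 MHz
```

At well over 100 MHz, small amounts of cable reflection or noise smear into visible ghosting, which is why cable quality matters on VGA but not on a digital DVI link.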

Ricardo Trindade , Nov 04, 2006; 05:55 p.m.

Thanks for your help, guys.

My graphics card only has one connection - for the VGA cable. The monitor also came with a "digital" cable... is this a question of upgrading the card...? Is it worth it?

I changed the refresh rate (from 60 to 75 Hertz), and it didn't improve anything.

I was able to greatly reduce the shadow by increasing the contrast of the image, but that seems to lose some detail in the brighter colours, and doesn't eliminate the problem - it only makes the shadows fainter. This doesn't sound ideal to me but it helps for now.

Any further help would be appreciated.

Karl Martin , Nov 04, 2006; 06:06 p.m.

It is possible that your monitor is faulty. If you can, try testing it on another computer (ideally, both the analog and digital connections). If the digital connection looks just as bad on another computer, then there is definitely something wrong with your monitor. If the analog connection looks just as bad on another computer, it may be either the cable or the monitor. Then, if you can, try another cable... It can be difficult to completely isolate the problem.

Presumably you were getting a sharp picture on your CRT, so your environment shouldn't have so much electrical noise that the LCD can't achieve the same... a new cable might suffice. But on some LCDs, I suspect the analog signal path is not as well shielded as it should be, almost forcing you to go digital.

Since your video card only has a VGA connector, you would have to get a new one to use the DVI (digital) connection on the monitor. If the monitor is not faulty, and a better cable does not give you a better picture, then you might have to go this route. If you don't need a high-end 3D gaming card, you can get a new one on the cheap (less than $100). DVI ports are standard now.

Karl Martin , Nov 04, 2006; 06:13 p.m.

I should add: to test whether the problem is a shoddy or poorly shielded analog cable, move the cable around a bit while watching the picture on screen. If the picture changes significantly, it is probably the cable. Also make sure both ends are fastened securely.

Erik de G. , Nov 05, 2006; 01:21 a.m.

Never use an LCD without the DVI connection. Unless it's just for showing palettes and such.

DVI drives every pixel directly and digitally from the computer. Ghosting is virtually impossible when you use it.

Bruce Rubenstein - NYC , Nov 05, 2006; 09:07 a.m.

When you use the VGA connector, the signal goes through an extra digital-to-analog conversion at the video card and an analog-to-digital conversion at the monitor. The quality of the electronics on the card and in the monitor has a significant influence on what you see. If you are seeing ghosting, a card with DVI output should cure it.

FWIW, I am driving my monitor via VGA (Matrox G550/NEC LCD2070NX - the G550 doesn't support the native 1600x1200 of the monitor in digital mode) and have no ghosting.

Ricardo Trindade , Nov 05, 2006; 04:11 p.m.

Yesterday, after I posted the initial message in this thread, the problem became much fainter, and I thought I could live with it. Right now, it's much worse... I guess this fits the "electrical noise" comment above (by Karl).

Btw, you can see a picture of this shadow here: http://photos1.blogger.com/blogger/7539/3507/1600/Clipboard02.jpg

If so, can anyone recommend a decent card? I do mostly graphics apps, no games, but the occasional video, etc. I'm also cheap by nature... ;)

Thanks again to all for your help.

Dan McCormack , Nov 07, 2006; 03:53 a.m.

Hi, Ricardo. Last week I bought a very similar monitor to yours -- a ViewSonic VX2235wm. It's the same as yours except in 22" (and I love it, you made a great choice). I originally had it connected with DVI, and the display was perfect. I then had to switch to VGA (because I use a KVM switch to use the same monitor with multiple computers, and my KVM switch doesn't support DVI), and I now have the ghosting problem. So you can be reasonably certain that the problem is with the VGA connection. You're lucky that the ViewSonic includes a DVI cable -- most of the other monitors I was considering don't include one, and DVI cables can cost over $50 to buy separately.

I agree with the other posts -- you should go down to Circuit City or Best Buy and pick up a cheap video card. Be sure it has a DVI (digital) output, and at least 64 MB (ideally 128 MB) of video memory. The memory is important because you want to be sure it can comfortably handle your monitor's native resolution of 1680x1050, since the image quality drops significantly if you run an LCD at a resolution other than its native one.
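For what it's worth, the memory needed for plain 2-D desktop work at that resolution is easy to check (a sketch assuming 32-bit colour and simple double buffering; 3-D textures and acceleration are a separate matter):

```python
# Back-of-the-envelope check that 64 MB of video memory easily covers
# a 2-D desktop at the monitor's native 1680x1050 resolution.
width, height = 1680, 1050
bytes_per_pixel = 4  # 32-bit colour

framebuffer = width * height * bytes_per_pixel  # one full frame
double_buffered = 2 * framebuffer               # front + back buffer

print(f"one frame: {framebuffer / 2**20:.1f} MB")            # ~6.7 MB
print(f"double-buffered: {double_buffered / 2**20:.1f} MB")  # ~13.5 MB
```

So even a 64 MB card has plenty of headroom for 2-D work at native resolution; the larger sizes mostly matter for 3-D.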

The other question is whether to get a card that uses PCI (worst), AGP (better), or PCI-Express (best) as its interface. If you know which interfaces your computer supports, then get a card with the best interface your computer can handle. But if you're not doing anything demanding like gaming or 3D rendering, then you probably won't notice any difference at all between the three interface types, so it might be safer (and definitely cheaper) to stick with PCI, which all modern computers support. Note that PCI and PCI-Express are different and not interchangeable, so be sure you don't get a PCI-Express card by accident if that's not what you want.

As for me, I bought a VisionTek Radeon 9250 card with a PCI interface and 128 MB of memory. It cost $79 at Circuit City and has no problem running my VX2235wm screen at 1680x1050.

