Hi,

I'm using an HP F70 17" LCD monitor that I've had for a few years. Recently, the cable that came with the monitor became damaged and the picture would cycle from red to blue to purple. I could still see the picture, but I knew I had to replace the cable. I went to Best Buy, picked up a new VGA cable (like my old one), and that's when my trouble began.

I noticed that the display configuration was not being saved when switching between Windows and games, the auto-config button was not working properly, and a weird "ghosting" effect appeared at the ends of text. I use a Radeon 9500 graphics card, so I upgraded the drivers, switched to the Omega drivers, and went back to the default drivers; that didn't help. I updated DirectX, and that still didn't help. I exhausted all software troubleshooting efforts.

I ended up testing the hardware and the monitor itself by hooking my 10-year-old CRT monitor up to the computer, and it worked fine. Still puzzled, I compared the plugs on my CRT monitor's cable, the original LCD cable, and the new cable. I noticed that both the CRT monitor's cable and the LCD cable had only 14 pins and were thick and heavy, while the new cable had 15 pins, was not as thick, and was very light. I concluded that the problem was not the monitor, the graphics card, or the rest of the hardware (at the time), but the cable itself.

My question is: what do I need to look for when purchasing a new cable? One friend told me that improper shielding may have caused the interference. Another friend simply said, "Nah, it's not the cable. Your monitor is probably pooped out."

I DO have a DVI port on the back of the monitor; however, my graphics card seems to support just the VGA one. Are there adapters available for the two?

Could it be possible that my monitor is indeed "pooped out"?
Does the fact that my monitor's cable uses 14 pins instead of 15 have anything to do with it?

[QUOTE]I noticed that both the CRT monitor's cable and the LCD cable had only 14 pins and were thick and heavy, while the new cable had 15 pins, was not as thick, and was very light.[/QUOTE]

Bingo: it's the thickness/sturdiness/quality of the cable that you need to look for. The heavier-duty cables are better shielded (and grounded) than the cheap, thin ones, and the wire they use is of higher quality. (The missing pin on the D-sub connector, however, is normal; it's pin 9, which traditionally carries no signal.)
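For reference, here's a rough sketch of the standard 15-pin VGA (DE-15) pinout as a little Python table; this is my own summary for illustration, not anything from HP or the cable maker. Note pin 9: on older cables it was just a key carrying no signal, and on newer ones it's sometimes repurposed as +5V for DDC, which is why a 14-pin cable can still work fine on older gear:

[CODE]
# Sketch of the standard VGA (DE-15) pinout, for reference.
# Pin 9 is the one commonly missing on older cables: it was
# originally a key (no signal) and was later repurposed as
# +5V power for DDC plug-and-play identification.
VGA_PINOUT = {
    1: "Red video",
    2: "Green video",
    3: "Blue video",
    4: "Reserved / monitor ID 2",
    5: "Ground",
    6: "Red ground",
    7: "Green ground",
    8: "Blue ground",
    9: "Key / +5V DDC (often absent on older cables)",
    10: "Sync ground",
    11: "Monitor ID 0",
    12: "Monitor ID 1 / DDC data (SDA)",
    13: "Horizontal sync",
    14: "Vertical sync",
    15: "DDC clock (SCL)",
}

for pin, signal in sorted(VGA_PINOUT.items()):
    print(f"pin {pin:2d}: {signal}")
[/CODE]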

[QUOTE]I DO have a DVI port on the back of the monitor; however, my graphics card seems to support just the VGA one. Are there adapters available for the two?[/QUOTE]

Not all types of DVI connectors carry the analog signals as well as the digital ones, but if your monitor has the type of DVI connector that does, VGA-to-DVI adapters are available.

Look at the DVI side of the adapter below; the four pins inside the cross at the right of the connector are the analog (VGA) lines. If your monitor has a connector like that, you should be in business:

[IMG]http://www.stevewolfonline.com/Downloads/DMR/Tech%20Uploads/DVI-VGA%20Adapter.jpg[/IMG]
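To make the "which DVI connectors carry analog" point concrete, here's a small illustrative sketch in Python; it's my own summary of the common DVI variants, not something from the adapter's documentation. The gist: a passive VGA adapter only works when the DVI side actually has the four analog pins, i.e. DVI-A or DVI-I:

[CODE]
# Illustrative summary of the common DVI connector variants and
# whether they carry the analog (VGA-compatible) signals.
# The analog lines are the four pins arranged around the flat
# blade (the "cross") on one side of the connector.
DVI_VARIANTS = {
    "DVI-D": {"digital": True,  "analog": False},  # digital only
    "DVI-A": {"digital": False, "analog": True},   # analog only (rare)
    "DVI-I": {"digital": True,  "analog": True},   # integrated: both
}

def supports_vga_adapter(variant: str) -> bool:
    """A passive VGA adapter only works if the DVI side
    carries the analog lines, i.e. DVI-A or DVI-I."""
    return DVI_VARIANTS[variant]["analog"]

for variant in DVI_VARIANTS:
    verdict = "works" if supports_vga_adapter(variant) else "won't work"
    print(f"{variant}: passive VGA adapter {verdict}")
[/CODE]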

Thank you for the response. However, I was told that my monitor was the problem. The cable was tested on various other (all newer) monitors and worked fine. The old cable I had replaced had been busted for some time, causing discoloration of the screen, which I could resolve by jiggling the cable, lol.

I was told it was possible that the damaged cable could have somehow caused the monitor to reject the new cable; I'm not sure how that would work. I still have faith I can get the old monitor working, but in the meantime I'll stick with the new one I replaced it with. :)

Thanks again!

I'm surprised that the problem actually turned out to be the monitor, given what you posted, but hey, at least you have a healthy system now. :)
