Hi,
I'm using an HP F70 17" LCD flat-screen monitor that I've had for a few years. Recently, the cable that came with the monitor became damaged and the picture would shift from red to blue to purple. I could still see the picture, but I knew I had to replace the cable. I went to Best Buy, picked up a new VGA cable (like my old one), and that's where my trouble began.
With the new cable, I noticed that the display configuration was not being saved when switching between Windows and games. The auto-config button was not working properly, and a weird "ghosting" effect appeared at the edges of text. I use a Radeon 9500 graphics card, so I upgraded the drivers, switched to the Omega drivers, and then went back to the default drivers; that didn't help. I also updated DirectX, which didn't help either. I exhausted all software troubleshooting options.
I ended up testing the hardware by connecting my 10-year-old CRT monitor to the computer, and it worked fine. Still puzzled, I compared the plugs on my CRT monitor's cable, the original LCD cable, and the new cable. Both the CRT cable and the original LCD cable had only 14 pins and were thick and heavy, while the new cable had 15 pins and was much thinner and lighter. So the problem was not the monitor, the graphics card, or any other hardware (at least at the time), but the cable itself.
My question is: what do I need to look for when purchasing a new cable? One friend told me that improper 'shielding' may have caused the interference. Another friend simply said, 'nah, it's not the cable. your monitor is probably pooped out.'
I DO have a DVI port on the back of the monitor; however, my graphics card only seems to support VGA. Are there adapters available to connect the two?
Could it be that my monitor is indeed "pooped out"?
Does the fact that my monitor's original cable has 14 pins instead of 15 have anything to do with it?