Alright, I'm having a problem. I've seen other threads about this issue, but I just want to be sure about something. I'm sure y'all have answered a question on this hardware a billion times, but please help and bear with me...
I have the Intel 82845G/GL/GE/PE/GV Graphics Controller. It's awful! I can't play many games on it because it doesn't support pixel shaders or vertex shaders, and I even have DirectX issues with it. I got sick of it and bought an ATI Radeon 9250 to replace it. It's not the best card, but it's sufficient for the games I want to play. I put it into the computer, and it's in there now, installed and everything. I've seen an increase in performance: I went from about 16 fps in Call of Duty to 40 fps. That's great and all, but the games that wouldn't work with the Intel graphics still won't work! What's the deal?

I found out that the Intel graphics chip is integrated into the motherboard. That sucks, because it still acts as the primary graphics card even though I put the new ATI card in. Now, I've read that I can disable the Intel chip, but I'm hesitant about that: since it's part of the motherboard, wouldn't disabling it mess up the motherboard too? I'm also afraid that if I disabled it and restarted the computer, my screen would just be blank, since my monitor is plugged into it. Is it safe to disable my integrated graphics? Anyone else had the same problem who can pass on some words of wisdom?