My computer seems to be destroying my monitors

Niwil

Geek Trainee
Two years ago I bought a new gaming PC: an AMD Phenom quad core with an ATI HD 4850 graphics card with 512 MB of RAM. The computer has 4 GB of RAM and the OS is Vista 64-bit.
After half a year, my Samsung SyncMaster broke down one day. It faded and went very dim; you could still see a faint change in the display when you pressed the Windows button, for instance. I sent it to Samsung and got a new monitor. Over the next year the new monitor did the same thing twice, but turning the computer off and on again fixed it. Until two months ago, when the faded screen stayed. Moving the screen to a different computer doesn't help.
Thinking it was that specific Samsung SyncMaster model that had this problem, I bought a new LED screen for the computer. It looks really great, but after only one day the new screen did the same thing: fading, with only dim changes when something happens on the screen (opening a new window, pressing the Windows key, etc.). If I connect the new monitor to my laptop there are no problems with it. And when I connect it back to my desktop computer it works fine for a while, but then the problem happens again. Now I am afraid to use it with my desktop computer at all, afraid that it will destroy this new monitor as well.

What do I do? Is it really possible that my computer is destroying my monitors? I should mention that I changed my graphics card a couple of months before the second Samsung monitor broke down, but I replaced it with a new ATI HD 4850 card (the same model as before).

If you think it is the computer destroying my monitors, which part of the computer should I suspect first? I am keen to blame the graphics card, but the fact that I changed it along the way weakens that theory a little.

Please help me, people. What should I do?
 
Just off the top of my head, it could be a problem with the refresh rate set on your desktop computer. A refresh rate set too high for the monitor can damage it. As a test, try starting your PC in Safe Mode and running it that way for a while (it will default to a very low resolution) and see if the display dims again. If it does, the refresh rate probably isn't the culprit, so try a different input cable (either the same connection type or a different one).
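
If you want to double-check what mode Windows is actually driving the primary display at, rather than trusting the Screen Properties dialog, a small script can ask the Win32 API directly. This is just a minimal sketch using Python's ctypes and the standard EnumDisplaySettings call (the DEVMODE structure below is truncated to the fields this query needs, which the API allows via dmSize):

# Query the resolution and refresh rate Windows reports for the
# primary display, via the Win32 EnumDisplaySettingsW API.
# Windows-only sketch; assumes the primary monitor is the one in question.
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1  # ask for the mode currently in use

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW: fields up to dmDisplayFrequency only.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

devmode = DEVMODEW()
devmode.dmSize = ctypes.sizeof(DEVMODEW)  # tells the API how much to fill
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(devmode)):
    print(f"Current mode: {devmode.dmPelsWidth}x{devmode.dmPelsHeight} "
          f"@ {devmode.dmDisplayFrequency} Hz")
else:
    print("EnumDisplaySettingsW failed")

A dmDisplayFrequency of 0 or 1 means the hardware's default rate is in use. Bear in mind this only tells you what the driver reports; a faulty card could still be putting out a bad signal, which is why the Safe Mode test is still worth doing.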
 
Thanks for the reply. I read about the refresh rate issue somewhere else too. Windows tells me my card runs at 60 Hz, and that should not be a problem for a brand-new LED screen. But can the actual refresh rate be different from what is displayed under screen properties?
I have had the issue with both my HDMI cable and a standard VGA cable.
Do you still think it is the refresh rate?
Could it be something other than the graphics card?
 