I have my PC connected to my TV via a DVI-to-HDMI adapter and to my monitor via a DVI-D cable, and everything has worked hunky dory thus far.
The other day I was running the PC graphics output to the TV to watch a movie and decided to unplug the monitor's power cable to free up a power socket.
When I plugged it back in, the graphics card no longer detected the monitor.
I've tried several tests:
1) I've tested both of the graphics card's DVI sockets individually with the TV via HDMI; both sockets work fine.
2) I've run HD video to the monitor from an HD cable box via an HDMI-to-DVI adapter; the monitor displayed the video fine, so the DVI input socket on the monitor is fine.
3) The monitor works fine using a DVI-to-VGA adapter at the graphics card and then a VGA lead to the VGA input on the monitor.
4) I've uninstalled and reinstalled the graphics driver, and I've tried hot-plugging the monitor as well as booting the PC with just the monitor plugged in on its own, to try to force the PC to recognise it.
The problem gets a bit weirder too...
I dual-boot between XP and Win 7. Running Win 7, if I have the TV in one DVI socket and the PC monitor in the other via the DVI-D cable, it simply reports that one display (the TV) is plugged in.
If I run Windows XP, it also reports one display plugged in, but if I drop the refresh rate to 50Hz it reports two displays; the monitor is greyed out, though, and if I click Enable it refuses to enable it.
Whenever the monitor is connected to the PC via DVI-D, it reports that no signal is found.
I suspect this may be a refresh rate issue. I'm pretty sure that when the PC is driving the monitor correctly over DVI-D it runs at 1680x1050 with a 59Hz refresh rate, and I get the feeling that if I could force the refresh rate to 59Hz it would sync with the monitor. I'm only basing this on what I saw on XP when switching from 60Hz to 50Hz made the monitor show up as available but disabled... so not basing it on much! To be honest, I'm a bit stuck on what to do next. Anyone got any ideas?
The graphics card is a Sapphire ATI 4870 Toxic and the monitor is a GMW 22" X2210WPS. The monitor has no drivers installed, as there don't seem to be any available apart from the generic Windows ones, and those worked fine until I unplugged the monitor while the PC was still running.
The really annoying thing is that I had this problem once before (duh!) when I did exactly the same thing: unplugged the monitor while the PC was still on. I managed to sort it after a day or so, but I cannot remember what I did to get the PC to kick the monitor into life again!
Can anyone suggest anything? Thanks in advance!!!