
[SOLVED] Have lost DVI-D signal to monitor

I have my PC connected to my TV via a DVI-to-HDMI adapter and to my monitor via a DVI-D cable, and everything has worked hunky-dory thus far.

The other day I was running the PC graphics output to the TV to watch a movie and decided to unplug the monitor's power cable to free up a power socket.

When I plugged it back in, the GFX card no longer detected the monitor.

I've tried several tests:

1) I've tested both GFX card DVI sockets individually to the TV via HDMI; both sockets work fine.
2) I've run HD video to the monitor from the HD cable box via an HDMI-to-DVI adapter; the monitor displays the video fine, so the DVI input socket on the monitor is fine.
3) The monitor works fine using a DVI-to-VGA adapter at the GFX card and then a VGA lead to the VGA input on the monitor.
4) I've uninstalled and reinstalled the GFX driver, and I've tried hot-plugging the monitor as well as booting the PC with just the monitor plugged in on its own to try to force the PC to recognise it.

The problem gets a bit weirder too...

I dual-boot between XP and Win 7. In Win 7, if I have the TV in one DVI socket and the PC monitor in the other via the DVI-D cable, it simply reports that one display (the TV) is plugged in.

If I run Windows XP it reports one display plugged in, but if I switch the refresh rate down to 50Hz it reports two displays; the monitor is greyed out, though, and if I click Enable it refuses to enable it.


When the monitor is plugged into the PC via DVI-D, it reports that no signal is found.

I kind of suspect this may be a refresh rate issue. I'm pretty sure that when the PC is running DVI-D correctly to the monitor it comes up as 1680x1050 at a 59Hz refresh rate, and I have a feeling that if I could force the refresh rate to 59Hz it would sync with the monitor. I'm only basing this on what I saw in XP when I switched from 60Hz to 50Hz and got the monitor showing as available but disabled... so not basing it on much! To be honest I'm a bit stuck now on what to do next. Anyone got any ideas?
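If anyone fancies testing the refresh-rate theory programmatically rather than through the driver menus, here's a rough sketch using the stock Win32 ChangeDisplaySettings API from Python (ctypes). The DEVMODE layout and constants are the standard ones from wingdi.h, but I haven't verified this on this exact card, so treat it as a starting point rather than a known fix:

Code:
# Rough sketch: read the current display mode, then request the same
# mode again at a 59 Hz refresh rate via Win32 ChangeDisplaySettings.
import ctypes

ENUM_CURRENT_SETTINGS = -1
DM_DISPLAYFREQUENCY = 0x00400000
CDS_TEST = 0x00000002      # validate the mode without actually switching
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODE(ctypes.Structure):
    # Display variant of the DEVMODEW layout; field order matters.
    _fields_ = [
        ("dmDeviceName",         ctypes.c_wchar * 32),
        ("dmSpecVersion",        ctypes.c_ushort),
        ("dmDriverVersion",      ctypes.c_ushort),
        ("dmSize",               ctypes.c_ushort),
        ("dmDriverExtra",        ctypes.c_ushort),
        ("dmFields",             ctypes.c_ulong),
        ("dmPositionX",          ctypes.c_long),
        ("dmPositionY",          ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor",              ctypes.c_short),
        ("dmDuplex",             ctypes.c_short),
        ("dmYResolution",        ctypes.c_short),
        ("dmTTOption",           ctypes.c_short),
        ("dmCollate",            ctypes.c_short),
        ("dmFormName",           ctypes.c_wchar * 32),
        ("dmLogPixels",          ctypes.c_ushort),
        ("dmBitsPerPel",         ctypes.c_ulong),
        ("dmPelsWidth",          ctypes.c_ulong),
        ("dmPelsHeight",         ctypes.c_ulong),
        ("dmDisplayFlags",       ctypes.c_ulong),
        ("dmDisplayFrequency",   ctypes.c_ulong),
        ("dmICMMethod",          ctypes.c_ulong),
        ("dmICMIntent",          ctypes.c_ulong),
        ("dmMediaType",          ctypes.c_ulong),
        ("dmDitherType",         ctypes.c_ulong),
        ("dmReserved1",          ctypes.c_ulong),
        ("dmReserved2",          ctypes.c_ulong),
        ("dmPanningWidth",       ctypes.c_ulong),
        ("dmPanningHeight",      ctypes.c_ulong),
    ]

user32 = ctypes.windll.user32
dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)

if user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm)):
    dm.dmDisplayFrequency = 59
    dm.dmFields |= DM_DISPLAYFREQUENCY
    # Test first; only apply if the driver says the mode is valid.
    if user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_TEST) == DISP_CHANGE_SUCCESSFUL:
        user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0)
        print("Requested %dx%d @ 59 Hz" % (dm.dmPelsWidth, dm.dmPelsHeight))
    else:
        print("Driver rejected 59 Hz for the current mode")

(One side note: the "59Hz" entry Windows shows is really the 59.94Hz NTSC-friendly timing rounded down for display, so 59 vs 60 in these settings does map to genuinely different pixel clocks.)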

The GFX card is a Sapphire ATI 4870 Toxic and the monitor is a GMW 22" X2210WPS. The monitor has no drivers installed, as there don't seem to be any available apart from the generic Windows ones, and those worked fine until I unplugged the monitor while the PC was still running.

The really annoying thing is I had this problem once before (DUH!!), when I did exactly the same thing: unplugged the monitor while the PC was still on. I managed to sort it after a day or so but can't remember what I did to kick the monitor into life again!

Can anyone suggest anything? Thanks in advance!!!
Re: Have lost DVI-D signal to monitor

Check which monitor is now set as primary. Windows may have defaulted to the TV as primary.
Re: Have lost DVI-D signal to monitor

Thanks for the reply. I've tried a number of different configs with single and multiple displays plugged in, essentially just to prove that both DVI outputs are actually working on the GFX card. With the TV and monitor both plugged in, it does default to the TV as the primary display and doesn't recognise the monitor.

If I unplug the TV and just use DVI-D to the monitor, either from boot-up or by hot-plugging, the display on the monitor remains blank.

Some additional info:

Swapped the DVI-D lead for another one with the same result, so it's not a defective lead.

Removed the graphics card and went back to the onboard graphics, and I still get the same problem.

The monitor itself seems to recognise that something is plugged in, as it gives a "No signal" message; if I unplug the DVI lead from the monitor it gives a "Monitor Disconnected" message.
Re: Have lost DVI-D signal to monitor

Check the settings in CCC (Catalyst Control Center).
Re: Have lost DVI-D signal to monitor

Uninstall the video drivers using Add and Remove Programs, shut down the PC, connect the monitor to the desired output, and power up the PC. If the monitor is properly recognized and functional, you can reinstall the latest drivers. After installing the latest drivers, shut down, connect the TV, power the TV on, and boot the PC. You can then use CCC to set up and configure the TV.

On a side note, this is a side effect of making hardware changes while the PC was powered up. When the monitor lost power, Windows had to reconfigure the display settings to maintain operation.
Re: Have lost DVI-D signal to monitor

Thanks for the replies and advice. Yes it appears that unplugging a monitor while the PC is up and running is not a good idea!!

Well, to update: I've taken my monitor into work and tried running it on DVI-D from my work PC, and it won't run on that either, despite the fact that I usually have a second monitor plugged into my laptop at work in this config. The only way I can currently get the monitor to display digital HD via the DVI-D socket is to use an HDMI-to-DVI adapter and run an HD video signal to it. If that didn't work I would suspect the monitor was defective.

I'll have a play tonight with uninstalling the drivers, which I tried before (removing all ATI software and then rebooting with just the monitor plugged in to try to force detection). The question this raises is that if the monitor won't work on DVI-D on any PC (like my work one), the monitor seems to be the problem here. It has gone into a kind of "sulk mode" because it was unplugged while still "handshaked" with my home PC.

I would suspect that it had a fault if it wasn't for the fact that this happened once before.

Also, on the Catalyst Control Centre/ATI display driver front: I can't even get a picture at the boot stage (BIOS screen), so even the basic BIOS display output isn't kicking it into life. The only thing that will get it displaying is to run the HD cable signal into the DVI socket, at which point it displays HD TV perfectly...

I'll try what has been suggested about removing all ATI drivers and trying to force Windows to detect with just the monitor connected.
Re: Have lost DVI-D signal to monitor

If it's not detecting properly on another PC either, I would suspect faulty hardware. You may also want to try a different cable.
Re: Have lost DVI-D signal to monitor

Yes, it has occurred to me that there's something wrong with the monitor, and I've tried two different DVI-D single-link cables; however, there are two things going against that at the moment:

First, the monitor will display digital video if I plug an HDMI-to-DVI-D converter into the DVI input on the monitor; it displays an HD cable picture perfectly.

Second, a couple of times yesterday when I was trying it with my work setup, I did get a picture up on it as a secondary display... Windows and the GFX card control software weren't recognising it as a second monitor plugged in, but it was displaying as an extended display... just not reliably. I couldn't get it to do that at home, though.

So I'm assuming there's a problem with the EDID part of the monitor. If I stream HD video to it, it'll display, as it doesn't need to handshake with the cable box; however, if I use it on a PC, the handshake can't be done, the monitor can't pass its EDID data to the PC, and the PC refuses to use it...
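For anyone curious about what that handshake actually involves: the monitor sends a 128-byte EDID identity block over the DDC lines, and Windows caches the last one it read from each monitor under the standard PnP display keys in the registry. Here's a rough Python sketch (standard library only) that dumps and sanity-checks those cached blocks; the base path is the usual location, but subkey names vary per machine, so treat it as a starting point rather than gospel:

Code:
# Rough sketch: dump the EDID blocks Windows has cached for each monitor
# and sanity-check them against the two basic EDID validity rules.
import winreg

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def subkeys(key):
    # Iterate a registry key's subkey names until EnumKey runs out.
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for monitor_id in subkeys(display):
        with winreg.OpenKey(display, monitor_id) as mon:
            for instance in subkeys(mon):
                try:
                    params = winreg.OpenKey(mon, instance + r"\Device Parameters")
                    edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # no cached EDID for this instance
                # A valid EDID starts 00 FF FF FF FF FF FF 00 and each
                # 128-byte block sums to 0 modulo 256.
                print("%s\\%s: %d bytes, header %s, checksum %s" % (
                    monitor_id, instance, len(edid),
                    "OK" if edid[:8] == HEADER else "BAD",
                    "OK" if sum(edid[:128]) % 256 == 0 else "BAD"))

If the cached block for the GMW fails the header or checksum test, that would back up the "sulk mode" theory: the monitor is returning garbage (or nothing) over the DDC lines, so the PC never sees a valid display to enable.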

However, like I said in an earlier post, I had this problem once before when I did exactly the same thing, and I can't for the life of me remember how I unlocked the monitor and got it working again, but I did somehow!

Well, unless anyone can think of anything else... I'm stuck with VGA. Not great, as Win 7 then identifies the monitor as a CRT and gets the resolutions wrong... which is why I switched to DVI-D in the first place! DOH!

Thanks for the advice!
Re: Have lost DVI-D signal to monitor

I've asked someone to have a look at this thread who may have a suggestion.
Re: Have lost DVI-D signal to monitor

The fact that it happened in the past and you were able to "fix" it doesn't really mean much. It could just as easily mean that is when it started to fail, and now it has failed to the point that it is no longer reliable. As you noted yourself, it only partially works and you can't consistently get it to function.

You have already proven that it can't be anything other than a monitor issue.
Re: Have lost DVI-D signal to monitor

Well... not really solved, but I've replaced it with a 24" monitor so it's no longer a problem. My sage advice... never unplug your monitor while your PC is up and running!

The old one still works as a 22" HDTV screen (very annoying, almost mocking!!), so it's not a complete loss!! Many thanks for all the wise advice! Some problems, I guess, just aren't solvable... well, not without a bit of cash!
Re: Have lost DVI-D signal to monitor

Glad you got it "fixed"!

I always use situations like those to upgrade as well.