
[SOLVED] 2nd Monitor detected but not displaying

#1 · (Edited by Moderator)
Hello all,

I need some ideas about how to solve this issue, I have tried everything I can think of.
I have had this second monitor (which is a TV) for a while now, and serving as a second display has become its only purpose. I recently got a large TV and, since I do a lot of editing, plugged it in for a minute just to get a realistic idea of how my work would look on it. I have done this a couple of times without a problem. Suddenly, after I plugged the VGA cable back into the 2nd monitor, it stopped displaying anything but remained detected by the computer.
This has happened before and somehow went away on its own. I'm not sure if I fixed it by accident, but it has happened again and I cannot get it to display anything.

I know it's not the VGA adapter, because it worked with the other TV; and since this has happened before, I can't be certain that the 2nd monitor itself is at fault. If I had to guess, I'd say the computer is getting confused in some way and I need to set it straight.

I have tried reinstalling drivers, among other things, with no luck. Any ideas?

Thank you in advance.
 
#4 ·
Re: 2nd Monitor detected but not displaying

Right-click on an empty part of the desktop and select Screen resolution from the menu.
In the Screen Resolution window, select the #2 monitor from the drop-down list and set the resolution to one the monitor supports, if it isn't already.
Then click Advanced settings and, on the Monitor tab, set a refresh rate the monitor supports. (If you'd rather check from a script which modes the monitor will actually accept, see the sketch below.)
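
Not part of the original posts, but here is a minimal sketch of the same check done from a script. It assumes Windows with Python 3 (only ctypes from the standard library) and that the second output is named \\.\DISPLAY2, which is an assumption; adjust the device name for your machine. It asks the driver, via the Win32 EnumDisplaySettingsW call, for every mode it reports for that device.

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DEVMODEW(ctypes.Structure):
    # Flattened layout of the Win32 DEVMODEW structure; only the display
    # fields are read here, but the full size must be declared.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def list_modes(device=r"\\.\DISPLAY2"):  # device name is an assumption
    """Print every resolution/refresh-rate combination the driver reports."""
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    while user32.EnumDisplaySettingsW(device, i, ctypes.byref(dm)):
        print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} "
              f"@ {dm.dmDisplayFrequency} Hz, {dm.dmBitsPerPel}-bit")
        i += 1

if __name__ == "__main__":
    list_modes()

If the resolution you set in the GUI doesn't appear in that list, that's a strong hint the monitor is being driven with settings it can't display.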
 
#5 ·
Re: 2nd Monitor detected but not displaying

Solved: what I did was make the 2nd monitor inactive, change it to the correct resolution, and then extend the displays; that seemed to do the trick. I think that because it had been plugged into the big TV, it still wanted to output 1920x1080, and my little TV was not having it.

Thank you.
 
#6 ·
Hi all. I know this thread is already marked as solved, but after spending days trying to get my 2nd monitor working, I thought I'd also offer the fix I discovered.

I have 3 monitor outputs on my machine: DVI-D, VGA and mini HDMI. My primary monitor always used the VGA output, and my 2nd monitor was running through the DVI output with a VGA adapter... until it died. So I bought a matched pair of Samsung SE390 LED 24-inch monitors.

No worries with the VGA output, but the 2nd monitor just wouldn't work. Eventually, after much faffing, Windows 7 finally detected the monitor, but there was no display. I tried every setting I could find and updated my NVIDIA drivers. Nothing. Detected but not displaying. I had assumed the tech advice I'd been given, that "all DVI connections are the same", was correct, which is not true: DVI-D carries only a digital signal, while DVI-I also carries analog, so a passive DVI-to-VGA adapter only works on a DVI-I (or DVI-A) port. I was also given the impression that if a monitor was "detected", the cabling must be delivering a picture. It isn't that simple: detection can happen over the EDID/DDC data lines even when no usable video signal is getting through.

After messing around for days, testing the screens on other PCs and swapping every conceivable combination of components, I bought an HDMI to mini HDMI cable for my 2nd monitor and the problem was solved. In retrospect it was an easy fix, but the hours of frustration and the countless tech forums I visited turned out to be unnecessary (though I must say, there is great information out there). I hope someone finds this useful.
 
#10 · (Edited)
This was the first result for a Google-suggested query (monitor detected but no display) when I ran into a similar problem with my work computer, so I'm going to tack my problem and the solution I came up with onto the end of this thread in case it happens to anyone else.

TL;DR: Use the Intel HD Graphics Control Panel (or the equivalent program for your graphics card) to add a custom resolution and refresh rate matching your monitor's native mode.

Slightly longer explanation:

Briefly, the problem is as described above: the Control Panel "Display" applet recognized that the monitor was plugged in, but the monitor would not display anything. The cause was also the same: the display settings were incompatible with the monitor. However, there was an added wrinkle: the "Display" applet wouldn't let me change the settings for the monitor that wouldn't display. Using the Intel HD Graphics Control Panel, however, I was able to add a custom resolution (my monitor's native resolution, 1920x1080) and restore dual-screen functionality.
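
Not part of the original post, but if you aren't sure which resolution counts as "native" for a panel, one way to check is the EDID block that Windows caches in the registry. This is a hedged sketch, assuming Python 3 on Windows and the usual Enum\DISPLAY registry layout; it decodes the first detailed timing descriptor, which normally describes the panel's preferred (native) mode.

import winreg

def native_resolutions():
    """Yield (monitor model key, native resolution) for cached EDID blobs."""
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        with winreg.OpenKey(
                                model_key,
                                instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                    except OSError:
                        continue  # instance without a cached EDID
                    if len(edid) < 72:
                        continue
                    # First detailed timing descriptor starts at byte 54;
                    # it usually holds the preferred (native) mode.
                    dtd = edid[54:72]
                    h = dtd[2] | ((dtd[4] & 0xF0) << 4)  # horizontal active
                    v = dtd[5] | ((dtd[7] & 0xF0) << 4)  # vertical active
                    yield model, f"{h}x{v}"

if __name__ == "__main__":
    for model, res in native_resolutions():
        print(model, res)

Note that this can also list monitors that were attached in the past, so match the model string against the screen you actually care about.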

The problem in full:

Hardware:

HP EliteDesk 800 G2 Small Form Factor PC
Intel i7 6700 @ 3.4 GHz
Intel HD Graphics (onboard graphics)
Dell E198FPf 19 inch Monitor (4:3 aspect ratio) attached at VGA Port
HP EliteDisplay E232 attached at DisplayPort

OS: Windows 7 Pro SP1 downgraded from Windows 10 Pro

I am a non-admin user trying to run a dual-display setup with my old VGA monitor and the DisplayPort monitor that shipped with the new computer. The computer had no problem driving a display plugged into the VGA port, but every once in a while the second monitor, plugged into either one of the DisplayPorts, wouldn't come back after sleep mode. I could get it working again by shutting down and restarting, but that isn't compatible with my needs for this computer.

After some googling (and finding this thread), it became apparent that the display settings and the monitor were likely incompatible. The solution that worked for me is as follows:

Right-click on the desktop and select Graphics Properties (not Screen Resolution).


The Intel HD Graphics Control Panel starts. Select the "Multiple Displays" tab from the left column.


Set up your screens how you want them. I have them set up to mimic the actual monitors on my desk, but everyone's a snowflake so have at it.


Click "Custom Resolutions" from the column at left. This brings up a warning. Read it, acknowledge that if you make an error you might fry your CPU. You're probably fine though if you are using the monitor that shipped with your tower.


Enter the native resolution and refresh rate for the second monitor you want to use, then click Add. NB: the native resolution and refresh rate I wanted are well within the stated capabilities of the onboard graphics I'm using; I don't know why they weren't among the default options in the first place.


Select "General Settings" from the column at left. Select the custom resolution you just made from the drop-down menu, ditto with refresh rate, click Apply.
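
Not part of the original post: once the custom mode is applied, a quick way to confirm that Windows is really driving both screens is to enumerate the attached monitors. A minimal sketch, assuming Windows and Python 3 (only ctypes from the standard library); it prints each active monitor's device name, resolution and desktop position.

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class MONITORINFOEXW(ctypes.Structure):
    # Win32 MONITORINFOEXW: monitor rectangle, work area, flags, device name.
    _fields_ = [
        ("cbSize", wintypes.DWORD),
        ("rcMonitor", wintypes.RECT),
        ("rcWork", wintypes.RECT),
        ("dwFlags", wintypes.DWORD),
        ("szDevice", wintypes.WCHAR * 32),
    ]

# Callback signature expected by EnumDisplayMonitors.
MonitorEnumProc = ctypes.WINFUNCTYPE(
    wintypes.BOOL, wintypes.HMONITOR, wintypes.HDC,
    ctypes.POINTER(wintypes.RECT), wintypes.LPARAM)

user32.EnumDisplayMonitors.argtypes = [
    wintypes.HDC, ctypes.POINTER(wintypes.RECT),
    MonitorEnumProc, wintypes.LPARAM]
user32.GetMonitorInfoW.argtypes = [
    wintypes.HMONITOR, ctypes.POINTER(MONITORINFOEXW)]

def report(hmon, hdc, lprect, lparam):
    info = MONITORINFOEXW()
    info.cbSize = ctypes.sizeof(MONITORINFOEXW)
    user32.GetMonitorInfoW(hmon, ctypes.byref(info))
    r = info.rcMonitor
    primary = " (primary)" if info.dwFlags & 1 else ""
    print(f"{info.szDevice}: {r.right - r.left}x{r.bottom - r.top}"
          f" at ({r.left},{r.top}){primary}")
    return True  # keep enumerating

user32.EnumDisplayMonitors(None, None, MonitorEnumProc(report), 0)

If the second screen shows up here with the resolution you just added, the mode took; if it's missing, Windows is still treating it as disconnected.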


NB: This is what worked for me; it may not apply to you, particularly if you are using a dedicated graphics card. I only post it because I think my situation is relatively common/generic.
 