I used to have both my monitor and my TV connected to my PC.
My old config was this: VGA cable from onboard VGA to my monitor.
VGA from my old graphics card (GeForce 8300 GS) to my TV.
(The TV was used as an extension of my desktop.)
In the BIOS, I had the primary display set to the onboard device instead of PCIe.
Always worked like a charm.
I recently replaced my graphics card with a Radeon HD 4890 Vapor-X.
Ever since, the resolution on my monitor has been wrong: the onboard VGA output was giving 1280x800 instead of 1440x900, and it refused to budge in the control panel!
Really weird, since that cable was never touched.
So I decided to change the BIOS setting from OnBoard to PCIe, and hooked my graphics card up to my monitor. Now the resolution was as it should be, 1440x900.
Since I had to deactivate the onboard VGA in the BIOS because of the resolution problems, I could no longer output to my TV.
So I decided to use the DVI-I output of my graphics card, with a little DVI-to-VGA converter, to connect my VGA cable to the VGA port on my TV. (The TV only has VGA and SCART.)
Anyway, the TV is not being recognized, and the settings in the control panels don't seem to help.
Should I be activating the DVI signal somewhere, or what?
All suggestions are appreciated.