Multi-monitor set-ups (PC~TV)

Cloak

Hey guys, I'm sure a lot of you are no strangers to multi-display set-ups for your PCs, so I'm hoping someone here can point me in the right direction. Currently I have a GeForce GTX 570 in my computer and I'm running two monitors off of it; the tricky part is that I'm now also trying to hook up a TV using a DVI splitter, duplicating my Dell 2209WA. I'm still waiting for the DVI-to-HDMI cable to arrive, so I decided to hook up the splitter in advance. But for whatever reason the DVI splitter seems to be squeezing the resolution on my second monitor. I can fix this by going into my monitor's menu and re-detecting the input source, but the issue returns whenever I turn the monitor off and on again.

So far I haven't had any luck googling this particular issue; there are plenty of people whose second or third display won't show up at all through a DVI splitter, but none I've seen have their resolution squeezed like this. Is it because I'm only using one of the two ports on the splitter at the moment? And what other issues have you guys run into trying to hook up a TV? Anyway, I'm pretty clueless when it comes to these audio/visual issues, so any help or guidance would be greatly appreciated. Thanks.


#2  Edited By monetarydread

You aren't running Linux, are you? That has been a known problem for GeForce cards. One thing I have been reading is that multi-monitor is tricky on Windows, and the OS does not like mixing sources (HDMI and DVI-D, for example) on monitors with different resolutions (1920x1080 vs 1680x1050). Oh, and splitters do not support different resolutions, so if your Dell monitor is 1680x1050 and you mirror it to a TV, both screens will run at a lower common resolution (TVs do not support 16:10 resolutions, so the output will drop down to 720p).
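If you want to double-check what Windows is actually driving each screen at (rather than what the monitor's OSD claims), here's a rough little sketch, assuming Python with the pywin32 package installed. It just lists the current mode for each display attached to the desktop:

# Rough sketch, assuming Windows and the pywin32 package (pip install pywin32).
# Prints the resolution and refresh rate Windows is currently sending to each
# display that is attached to the desktop.
import win32api
import win32con

for i in range(8):  # check the first few display devices
    try:
        device = win32api.EnumDisplayDevices(None, i)
    except Exception:
        break
    if not (device.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP):
        continue
    mode = win32api.EnumDisplaySettings(device.DeviceName, win32con.ENUM_CURRENT_SETTINGS)
    print("%s: %dx%d @ %d Hz" % (device.DeviceName, mode.PelsWidth, mode.PelsHeight, mode.DisplayFrequency))

If the splitter really is forcing a lower mode, the Dell should show up there as something other than 1680x1050.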

HoboZero

I can't say I've tried a passive splitter before, but it seems straightforward. The thing with DVI and HDMI is that they allow the display to talk back to the source. With VGA it's purely one-way output, but when you connect a device digitally, it reports back to the source things like "here is the best resolution for me to run at" and "I prefer this scan rate/refresh rate".

Now, with a DVI splitter, the output source is expecting to talk to one display device, but it's actually connected to two. I don't know how, or even if, the output would be able to tell which device is responding to its queries. It's possible it only uses feedback from the most recent device to "talk" to it - which in this case would be the last device turned on, and that would explain why restarting your monitor fixes the issue.
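Incidentally, that information the display reports back is called EDID. If you're curious what your Dell is actually advertising, here's a rough sketch, assuming you can save the raw EDID to a file (on Linux it's exposed under /sys/class/drm/, e.g. something like card0-DVI-D-1/edid; on Windows you'd need a tool to dump it). The "edid.bin" filename is just a placeholder - it pulls the preferred mode out of the first detailed timing descriptor:

# Rough sketch: read the preferred mode from a raw EDID dump.
# "edid.bin" is a placeholder filename for wherever you saved the dump.
with open("edid.bin", "rb") as f:
    edid = f.read()

# Every EDID block starts with the fixed 8-byte header 00 FF FF FF FF FF FF 00.
assert edid[:8] == bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00]), "not an EDID block"

# The first detailed timing descriptor (bytes 54-71) holds the preferred mode.
dtd = edid[54:72]
h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)        # horizontal active pixels
v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)        # vertical active lines
pixel_clock_khz = (dtd[0] | (dtd[1] << 8)) * 10   # stored in units of 10 kHz

print("Preferred mode: %dx%d (pixel clock %d kHz)" % (h_active, v_active, pixel_clock_khz))

Whatever the card ends up seeing in that handshake is what it will drive, so if the splitter garbles or swaps it, you get exactly the kind of squeezed picture you're describing.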

As I said, I've never used one of these myself, so I could be wrong. I would think you would need some sort of active splitter rather than just a simple cable.

It's a few dollars more, but you might try something like this for three or more monitors. It's a bus-powered USB version of an Intel graphics card. I'm not sure you'd be able to game on it, but if you're just looking to output to another monitor for media, it would be fine.