Neverdies
2 discussion posts
Hi,

I'm trying to figure out why my two graphics cards aren't playing nice together. One's integrated, an Intel HD 4600, and the other's my real one, an nVidia GTX 645. I'm running a dual-monitor setup with a 1920x1080 display as my primary and a 1280x1024 as my secondary.

My issue is that DisplayFusion lists the larger primary one as Monitor #2, and the smaller secondary one as Monitor #1. I'm not sure if this is the cause or the effect, but the end result is that the smaller secondary monitor is using my GTX 645 (and the nVidia settings won't recognise the other monitor), while my larger primary monitor is running off the integrated HD 4600, which is struggling with the games that I play. The Intel settings screen also fails to recognise the smaller monitor's existence.

Is it a problem with how the monitors are plugged into the computer? It's a pre-built computer, so I imagine the graphics cards themselves are not the issue. But it's somewhat annoying that my computer is finally doing exactly what I want it to do, just on the wrong monitors.
Nov 16, 2016  • #1
Alan Wade
Is there a reason why you are plugged into the integrated graphics?
If you have a GTX 645 installed in a PCIe slot, then use that for both of your monitors and disable the integrated graphics in the BIOS.

Using just the GTX card will also let you determine, by the order in which you plug your monitors in, which is Monitor 1 and which is Monitor 2.
Nov 17, 2016  • #2
Neverdies
2 discussion posts
The idea I started off with is that the large monitor would run on the GTX and be used pretty much purely for games and other graphics-intensive stuff, so the FPS would be good/decent and I could pump the graphics up to a respectable level. Meanwhile, the smaller monitor could run less intensive but still draining stuff like Facebook or a movie that would normally cause FPS issues. I may be entirely wrong here, and my knowledge of computers isn't hugely extensive (though I want to learn), so feel free to tell me if this isn't feasible for whatever reason, but that was the original idea.
Nov 17, 2016  • #3
Keith Lammers (BFS)
The monitor IDs themselves shouldn't have any effect on which video card gets used. Is the big monitor plugged into the NVIDIA card? It shouldn't be possible for the monitor to be using the onboard card if it's plugged into the NVIDIA card, so it sounds like maybe the cables are just backwards. BUT (!) I've been wrong before :)
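
If you want to double-check which adapter Windows thinks each display is attached to before swapping anything, here's a rough sketch (not a DisplayFusion feature, just an illustration) that uses the standard Win32 EnumDisplayDevices call from Python's standard library on Windows:

# Rough sketch: list each active display output and the adapter driving it.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128)]

ATTACHED_TO_DESKTOP = 0x1   # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP
PRIMARY_DEVICE = 0x4        # DISPLAY_DEVICE_PRIMARY_DEVICE

i = 0
dev = DISPLAY_DEVICE()
dev.cb = ctypes.sizeof(DISPLAY_DEVICE)
while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    if dev.StateFlags & ATTACHED_TO_DESKTOP:
        tag = " (primary)" if dev.StateFlags & PRIMARY_DEVICE else ""
        # DeviceName is e.g. \\.\DISPLAY1, DeviceString is the adapter name
        print(dev.DeviceName + tag + " -> " + dev.DeviceString)
    i += 1
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(DISPLAY_DEVICE)

If the display marked "(primary)" shows up against the Intel HD 4600 in that output, then the cabling (or the BIOS display priority) is the thing to look at.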

If you could attach a copy of your troubleshooting info as well, I can check it out :)
  • Open the Settings > Troubleshooting tab
  • Click the "Copy to Clipboard" button
  • Paste the text into a text file (please don't paste the text directly into your reply, the formatting gets garbled and makes it difficult to parse)
  • Reply with the file attached
Nov 17, 2016  • #4
Alan Wade
The FPS loss from having two screens, one for gaming and one for "everyday" use, is minimal to the point where you wouldn't notice it, as the GPU renders each screen independently. Obviously, make sure your gaming monitor is the main screen.
Nov 19, 2016  • #5