Windows uses Generic non-PnP monitor settings on boot-up

I have two computers at my desk. I switch between them using a KVM switch and a single keyboard, monitor, and mouse. If I power up my second computer (a Win7 machine) when I'm not switched to it, the display starts up using "Generic non-PnP monitor" and a very low resolution. I have to go to Control Panel, Display, Adjust Resolution, then click on Detect, at which point the computer recognizes the monitor (Acer G235H) and resets the resolution appropriately.

Is there any way to lock in the monitor setting so that it will always assume the correct monitor and keep the resolution I usually use?
nobus commented:
Normally, Windows only recognizes connected devices. You could try the devcon rescan command.
How is the monitor connected: VGA or HDMI?
Your KVM switch may not pass the monitor's identification through when you're not switched to that machine — test with another monitor, or another switch.
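For reference, devcon is a command-line Plug and Play utility that ships with the Windows Driver Kit; it is not installed with Windows itself. A typical invocation to force a hardware rescan looks like this (run from an elevated command prompt; the `find` class name is the standard "Monitor" device setup class):

```shell
:: Trigger a Plug and Play rescan so Windows re-detects attached devices.
devcon rescan

:: List devices in the Monitor setup class to confirm the Acer was detected.
devcon find =monitor
```

If the rescan succeeds, the Acer G235H should replace the Generic Non-PnP Monitor without a trip through Control Panel.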
ElrondCT (Author) commented:
The connection is VGA.

As far as I can tell, the problem is that Windows isn't getting any response from the monitor when it starts up, so it's defaulting to a Generic monitor type. I'm asking if there's some way to tell Windows to assume that the monitor is an Acer G235H even without the monitor being connected. So the switch really shouldn't matter.

ElrondCT (Author) commented:
Devcon doesn't seem useful for my situation. It's intended for device developers, and it's only available if you've installed the Windows Driver Kit (which I have not).

So it looks like the answer to my question is that there's no way to lock in the monitor setting; it's always based on a response from the device, and if no monitor responds at boot, Windows falls back to Generic with a low resolution. I have to reset it once the monitor is connected to the system.
nobus commented:
Then why not install the kit? Then at least you could test it.
ElrondCT (Author) commented:
Getting the driver kit installed seemed like too much hassle for too little likelihood of a payoff, so I chose not to do that. However, I have discovered that if I change the resolution for the default monitor, that setting is saved from one session to the next. That was what I was really most concerned about; as long as the resolution is set properly, I don't really care which monitor the system thinks it's talking to. So I'm feeling OK about the situation.
Question has a verified solution.