I often see on mid-range business desktop PCs (e.g. Dell and HP) that the CPU virtualization extensions (Intel VT-x or AMD-V) are disabled by default in the BIOS/UEFI settings. I usually enable them just so the capability is available in case I ever want to use it.
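For context, after flipping the setting I usually sanity-check it from the OS. Below is a minimal sketch of that check, assuming a Linux host: it looks for the `vmx` (Intel VT-x) or `svm` (AMD-V) flag in `/proc/cpuinfo`. Note that the flag only shows the CPU advertises the extension; whether the firmware actually allows its use is reported separately (the kvm module typically logs a "disabled by BIOS" message at load time if not).

```python
#!/usr/bin/env python3
"""Check whether the CPU advertises hardware virtualization extensions.

Sketch for a Linux host: 'vmx' = Intel VT-x, 'svm' = AMD-V.
The CPUID flag reflects hardware support; firmware may still have the
feature locked off, which the kvm module reports when it loads.
"""

def virt_flags():
    """Return the subset of {'vmx', 'svm'} listed in /proc/cpuinfo."""
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                # The flags line looks like: "flags : fpu vme ... vmx ..."
                return {"vmx", "svm"} & set(line.split(":", 1)[1].split())
    return set()

if __name__ == "__main__":
    flags = virt_flags()
    if flags:
        print("CPU advertises:", ", ".join(sorted(flags)))
    else:
        print("No vmx/svm flag: this CPU does not report VT-x/AMD-V support.")
```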
Is there a reason they should be left disabled? Is there some security or performance benefit to leaving them disabled on regular PCs that won't be doing any virtualization?
If so, does the same argument apply to servers that don't need virtualization?