I've been experiencing a problem with NFS and vCenter. We're using Veeam for backups, and one feature (SureBackup) uses NFS to do its testing. About three weeks ago I started getting errors from the SureBackup jobs saying they couldn't mount the NFS volume:
Error: Unable to mount vPower NFS volume (VEEAMCDP:/VeeamBackup_VEEAMCDP). VeeamCDP.domain.local: An error occurred during host configuration.
In vCenter I get this corresponding error:
NFS mount VeeamCDP.cunj.local:/VeeamBackup_VEEAMCDP failed: Unable to connect to NFS server.
I've been going back and forth with Veeam support and they are saying this is a problem on the VMware side; I get the same error when trying to mount the volume manually through VMware. A lot of this is new to me, so I'm working to figure it out while trying to solve the problem. I'd really appreciate any guidance on what to try next.
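For reference, this is roughly the manual mount attempt I was reproducing from an ESXi host's shell (the hostname and export path come from the error above; the datastore name here is arbitrary):

# attempt the same mount vCenter would perform when adding the NFS datastore
esxcli storage nfs add -H VeeamCDP.domain.local -s /VeeamBackup_VEEAMCDP -v VeeamBackup_VEEAMCDP
# confirm whether it attached
esxcli storage nfs list
# clean up the test mount if it did attach
esxcli storage nfs remove -v VeeamBackup_VEEAMCDP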
I did go over the required ports for Veeam's vPower NFS, which are all listening and set to the defaults. I can vmkping from my three ESX hosts out to the production network (the checks I ran are below). There are no event log errors related to NFS (or much at all) on the Veeam server.
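The connectivity checks from each host's shell were along these lines (2049 is the standard NFS port; the exact ports the vPower NFS service uses should be confirmed in the Veeam settings):

# basic reachability from the host's vmkernel interface to the Veeam server
vmkping VeeamCDP.domain.local
# check that the NFS port answers from the host's point of view
nc -z VeeamCDP.domain.local 2049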
I've checked the firewall (tried running with it disabled as well), rechecked the account permissions, and made sure the service is running. That article is actually what the Veeam tech and I were both working from; I can't find any issues with the items on that page.
So this was a good one. It turned out that someone had changed the gateway address on each of the three ESX hosts, so we didn't have proper communication set up, which ultimately caused the problem. Everyone contributed here, so I'll split up the points. Thanks, all.
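For anyone who runs into the same thing, the routing table can be checked and corrected from each host's shell roughly like this (the gateway address below is just a placeholder; use your real router):

# show the host's routing table; the "default" entry is the gateway that had been changed
esxcli network ip route ipv4 list
# point the default route back at the correct gateway
esxcli network ip route ipv4 add --gateway 192.168.1.1 --network default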