
vCenter Troubleshooting TIPS: ESXi 6.0 reports “error code: 15” during a VUM Remediate operation

Luciano Patrão
VCP6.5-DCV, vSAN Specialist, vExpert (last 3 years), Veeam Vanguard. Expertise in VMware, virtual backups and storage design, and an active blogger.
This is an error that can happen when you try to update your ESXi hosts. The host in this case is a Hewlett-Packard DL360 G9 running ESXi 6.0 build 3568940.
 
Scanning the host with VMware Update Manager shows 17 updates to install, of which 7 can be staged (the rest are older versions), but when we try to remediate the host we get this:

Remediate entity esxi721.localdomain. The host returns esxupdate error code: 15. The package manager transaction is not successful.
Check the Update Manager log files and esxupdate log files for more details.
Looking at esxupdate.log, there is some information about the locker folder:

2016-04-24T15:11:44Z esxupdate: downloader: DEBUG: Downloading from http://esxi721.localdomain:9084/vum/repository/hostupdate/vmw/vib20/tools-light/VMware_locker_tools-light_6.0.0-2.34.3620759.vib...
2016-04-24T15:12:48Z esxupdate: LockerInstaller: WARNING: There was an error in cleaning up product locker: [Errno 2] No such file or directory: '/locker/packages/var/db/locker'
2016-04-24T15:12:48Z esxupdate: esxupdate: ERROR: An esxupdate error exception was caught:
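If you want to follow this log live while VUM runs, you can tail it on the host over SSH; /var/log/esxupdate.log is the standard location for it on ESXi:

# Watch esxupdate activity in real time during a scan/stage/remediate:
tail -f /var/log/esxupdate.log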
So we need to investigate on the ESXi host. The VMware Knowledge Base article about this 'error 15' says to double-check the folder/link chain: Locker > Store. I checked that the link and folder exist, and all is okay. I then checked the locker folder/link to confirm that the locker link is valid:
[root@esxi721:~] ls /locker/
packages    var         vsantraces


[root@esxi721:~] ls -l /
total 565
lrwxrwxrwx    1 root     root            49 Apr 23 21:23 altbootbank -> /vmfs/volumes/764b33e1-310325dd-1ebb-5399a3f70a03
drwxr-xr-x    1 root     root           512 Apr 23 21:22 bin
lrwxrwxrwx    1 root     root            49 Apr 23 21:23 bootbank -> /vmfs/volumes/2b0d84e8-332418f4-8e2c-e841fe1625cb
-r--r--r--    1 root     root        331579 Feb 19 02:24 bootpart.gz
drwxr-xr-x   15 root     root           512 Apr 24 20:08 dev
drwxr-xr-x    1 root     root           512 Apr 24 19:23 etc
drwxr-xr-x    1 root     root           512 Apr 23 21:22 lib
drwxr-xr-x    1 root     root           512 Apr 23 21:22 lib64
-r-x------    1 root     root         28377 Apr 23 21:19 local.tgz
lrwxrwxrwx    1 root     root             6 Apr 23 21:23 locker -> /store
drwxr-xr-x    1 root     root           512 Apr 23 21:22 mbr
drwxr-xr-x    1 root     root           512 Apr 23 21:22 opt
drwxr-xr-x    1 root     root        131072 Apr 24 20:08 proc
lrwxrwxrwx    1 root     root            22 Apr 23 21:23 productLocker -> /locker/packages/6.0.0
lrwxrwxrwx    1 root     root             4 Feb 19 01:54 sbin -> /bin
lrwxrwxrwx    1 root     root            12 Apr 23 21:23 scratch -> /tmp/scratch
lrwxrwxrwx    1 root     root            49 Apr 23 21:23 store -> /vmfs/volumes/56bb2e57-7933dd24-7e9c-00110a6930e4
drwxr-xr-x    1 root     root           512 Apr 23 21:22 tardisks
drwxr-xr-x    1 root     root           512 Apr 23 21:22 tardisks.noauto
drwxrwxrwt    1 root     root           512 Apr 24 20:08 tmp
drwxr-xr-x    1 root     root           512 Apr 23 21:22 usr
drwxr-xr-x    1 root     root           512 Apr 23 21:22 var
drwxr-xr-x    1 root     root           512 Apr 23 21:22 vmfs
drwxr-xr-x    1 root     root           512 Apr 23 21:22 vmimages
lrwxrwxrwx    1 root     root            17 Feb 19 01:54 vmupgrade -> /locker/vmupgrade
[root@esxi721:~]

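A quick way to walk that symlink chain by hand is sketched below, using busybox readlink; the volume UUID comes from the listing above:

# Resolve each link in the chain:
#   /productLocker -> /locker/packages/6.0.0, /locker -> /store,
#   /store -> the VFAT locker volume
readlink /productLocker
readlink /locker
readlink /store
ls /vmfs/volumes/56bb2e57-7933dd24-7e9c-00110a6930e4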


I checked to see if the store location is correct:

[root@esxi721:~] ls -ltr /
total 565
lrwxrwxrwx    1 root     root            17 Feb 19 01:54 vmupgrade -> /locker/vmupgrade
lrwxrwxrwx    1 root     root             4 Feb 19 01:54 sbin -> /bin
-r--r--r--    1 root     root        331579 Feb 19 02:24 bootpart.gz
-r-x------    1 root     root         28377 Apr 23 21:19 local.tgz
drwxr-xr-x    1 root     root           512 Apr 23 21:22 vmimages
drwxr-xr-x    1 root     root           512 Apr 23 21:22 vmfs
drwxr-xr-x    1 root     root           512 Apr 23 21:22 var
drwxr-xr-x    1 root     root           512 Apr 23 21:22 usr
drwxr-xr-x    1 root     root           512 Apr 23 21:22 tardisks.noauto
drwxr-xr-x    1 root     root           512 Apr 23 21:22 tardisks
drwxr-xr-x    1 root     root           512 Apr 23 21:22 opt
drwxr-xr-x    1 root     root           512 Apr 23 21:22 mbr
drwxr-xr-x    1 root     root           512 Apr 23 21:22 lib64
drwxr-xr-x    1 root     root           512 Apr 23 21:22 lib
drwxr-xr-x    1 root     root           512 Apr 23 21:22 bin
lrwxrwxrwx    1 root     root            49 Apr 23 21:23 store -> /vmfs/volumes/56bb2e57-7933dd24-7e9c-00110a6930e4
lrwxrwxrwx    1 root     root            12 Apr 23 21:23 scratch -> /tmp/scratch
lrwxrwxrwx    1 root     root            22 Apr 23 21:23 productLocker -> /locker/packages/6.0.0
lrwxrwxrwx    1 root     root             6 Apr 23 21:23 locker -> /store
lrwxrwxrwx    1 root     root            49 Apr 23 21:23 bootbank -> /vmfs/volumes/2b0d84e8-332418f4-8e2c-e841fe1625cb
lrwxrwxrwx    1 root     root            49 Apr 23 21:23 altbootbank -> /vmfs/volumes/764b33e1-310325dd-1ebb-5399a3f70a03
drwxr-xr-x    1 root     root           512 Apr 24 19:23 etc
drwxrwxrwt    1 root     root           512 Apr 24 20:42 tmp
drwxr-xr-x    1 root     root        131072 Apr 24 20:42 proc
drwxr-xr-x   15 root     root           512 Apr 24 20:42 dev
[root@esxi721:~]



All is okay, so next we need to check the /locker/packages folder to see if the version folder (in this case 6.0.0) exists.

[root@esxi721:~] cd /locker/packages/
[root@esxi721:/vmfs/volumes/56bb2e57-7933dd24-7e9c-00110a6930e4/packages] ls
var



The 6.0.0 folder doesn't exist, and there are no floppies or vmtools folders with the files that ESXi and VUM need for the updates. The VMware Knowledge Base recommends deleting the old folders and links and recreating them, but in this case there is nothing to delete; we only need to recreate the folder and copy the necessary files (we will use another ESXi host with the same build).
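If you prefer to recreate the layout before copying, a minimal sketch is below. It is optional, since scp -r recreates directories as it copies, and the subfolder names (vmtools, floppies) are assumed from a healthy host of the same build:

# Recreate the version folder the locker should contain
# (subfolder names assumed from a healthy 6.0.0 host):
mkdir -p /locker/packages/6.0.0/vmtools
mkdir -p /locker/packages/6.0.0/floppies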
 

Connecting to another host, we will use SCP to copy all the files to this ESXi host. First, if the SSH Client is not enabled in the host firewall, you need to enable it before you can use the scp command. To enable the SSH Client on the source ESXi host:

[root@esxi720:~] esxcli network firewall ruleset set --enabled true --ruleset-id=sshClient

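To confirm the ruleset is now open, you can list it (just a sanity check):

# Verify the sshClient ruleset state (the Enabled column should show true):
esxcli network firewall ruleset list --ruleset-id sshClient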


Note: Don’t forget to disable the SSH Client after finishing this task. After you run the scp command you will be prompted for the root password of the remote host, and once you have successfully authenticated the files will copy.
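Disabling it afterwards is the same esxcli command with the flag flipped:

# Close the sshClient ruleset again once the copy is done:
esxcli network firewall ruleset set --enabled false --ruleset-id=sshClient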

[root@esxi720:/vmfs/volumes/566a92b0-97db4da2-c8be-00110a69322c] scp -r /locker/packages/ root@esxi721:/locker
The authenticity of host 'esxi721 (esxi721)' can't be established.
RSA key fingerprint is SHA256:bkmqdMHuJgAWEA5s96pWOTDJO3B7FxUzgJ0t0BnqFeM.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'esxi721' (RSA) to the list of known hosts.
Password:
pvscsi-Windows2003.flp                                                                100%  118KB 117.5KB/s   00:00
pvscsi-Windows2008.flp                                                                100%  122KB 122.0KB/s   00:00
pvscsi-WindowsXP.flp                                                                  100%  114KB 114.0KB/s   00:00
vmscsi.flp                                                                            100%   42KB  41.5KB/s   00:00
solaris.iso                                                                           100%   13MB  12.8MB/s   00:01
solaris_avr_manifest.txt                                                              100%   49     0.1KB/s   00:00
darwin.iso.sig                                                                        100%  256     0.3KB/s   00:00
winPre2k.iso.sig                                                                      100%  256     0.3KB/s   00:00
windows.iso                                                                           100%   87MB  14.4MB/s   00:06
linux_avr_manifest.txt                                                                100% 1738     1.7KB/s   00:00
freebsd.iso                                                                           100%   15MB  15.1MB/s   00:00
netware.iso                                                                           100%  528KB 528.0KB/s   00:00
winPre2k.iso                                                                          100%   13MB  13.4MB/s   00:01
windows_avr_manifest.txt                                                              100% 1069     1.0KB/s   00:00
solaris.iso.sig                                                                       100%  256     0.3KB/s   00:00
darwin.iso                                                                            100% 3022KB   3.0MB/s   00:01
winPre2k_avr_manifest.txt                                                             100%   49     0.1KB/s   00:00
windows.iso.sig                                                                       100%  256     0.3KB/s   00:00
linux.iso                                                                             100%   71MB  11.8MB/s   00:06
scp: /locker/packages/packages/6.0.0/vmtools/linux.iso: truncate: No space left on device
scp: /locker/packages/packages/6.0.0/vmtools/linux.iso: No space left on device
scp: /locker/packages/packages/6.0.0/vmtools/netware.iso.sig: No space left on device
scp: /locker/packages/packages/6.0.0/vmtools/freebsd.iso.sig: No space left on device
scp: /locker/packages/packages/6.0.0/vmtools/tools-key.pub: No space left on device
scp: /locker/packages/packages/6.0.0/vmtools/linux.iso.sig: No space left on device
scp: /locker/packages/packages/var: No space left on device
scp: /locker/packages/packages/db: No space left on device



Only when trying to copy the files do we find the real issue; there is nothing in the logs related to it. There is not enough free space to apply the updates, so we need to double-check the root space.

[root@esxi721:~] stat -f /
  File: "/"
    ID: 100000000 Namelen: 127     Type: visorfs
Block size: 4096
Blocks: Total: 655532     Free: 455845     Available: 455845
Inodes: Total: 524288     Free: 519299

[root@esxi721:~] vdf -h
Tardisk                  Space      Used
sb.v00                    139M      139M
s.v00                     330M      330M
net_tg3.v00               300K      298K
elxnet.v00                508K      505K
ima_be2i.v00                2M        2M
....
-----
Ramdisk                   Size      Used Available Use% Mounted on
root                       32M      248K       31M   0% --
etc                        28M      240K       27M   0% --
opt                        32M      368K       31M   1% --
var                        48M      740K       47M   1% --
tmp                       256M        5M      250M   2% --
iofilters                  32M        0B       32M   0% --
hostdstats               1303M        2M     1300M   0% --
stagebootbank             250M      191M       58M  76% --

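For reference, stat -f reports 4096-byte blocks, so the root visorfs has roughly 2.5 GB total and 1.7 GB free; you can convert the counts on the host shell (blocks divided by 256 gives MiB):

# Convert the stat -f block counts (4096-byte blocks) to MiB:
echo "total: $((655532 / 256)) MiB"   # ~2560 MiB
echo "free:  $((455845 / 256)) MiB"   # ~1780 MiB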


Here I don't see any issue with space in the ramdisks, but I do see some big tardisk files. Checking the filesystems, I see that the one used for the locker is 100% full.

[root@esxi721:] df -h
Filesystem    Size   Used Available Use% Mounted on
NFS        1000.0G 692.5G    307.5G  69% /vmfs/volumes/vol01
NFS        1000.0G 459.2G    540.8G  46% /vmfs/volumes/vol02
NFS        1000.0G 577.9G    422.1G  58% /vmfs/volumes/vol03
NFS        1000.0G 822.5G    177.5G  82% /vmfs/volumes/vol04
NFS        1000.0G 570.4G    429.6G  57% /vmfs/volumes/vol05
NFS        1000.0G 398.5G    601.5G  40% /vmfs/volumes/vol06
NFS         666.5G 363.4G    303.1G  55% /vmfs/volumes/iso-vol
NFS        1000.0G 519.6G    480.4G  52% /vmfs/volumes/vol07
NFS        1000.0G 692.1G    307.9G  69% /vmfs/volumes/vol08
vfat        249.7M 185.4M     64.3M  74% /vmfs/volumes/2b0d84e8-332418f4-8e2c-e841fe1625cb
vfat        249.7M 185.1M     64.6M  74% /vmfs/volumes/764b33e1-310325dd-1ebb-5399a3f70a03
vfat        285.8M 285.4M    488.0K 100% /vmfs/volumes/56bb2e57-7933dd24-7e9c-00110a6930e4

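On a host with many datastores you can spot the full volume quickly with a trivial filter (assuming the df output format above):

# Show only filesystems at 100% usage:
df -h | grep ' 100% '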


So the next step is to find the big files and logs; we also need to look inside /tmp to see if there are any dump files or other big files contributing to this issue.

[root@esxi721:~] find / -path "/vmfs" -prune -o -type f -size +50000k -exec ls -l '{}' \;
-r--r--r--    1 root     root     146513728 Apr 23 21:22 /tardisks/sb.v00
-r--r--r--    1 root     root     347061695 Apr 23 21:22 /tardisks/s.v00
-rw-r--r--    1 root     root      97493923 Apr 24 15:10 /tmp/stagebootbank/s.v00
-rw-------    1 root     root     15931539456 Apr 24 20:48 /dev/disks/mpx.vmhba32:C0:T0:L0
-rw-------    1 root     root     2684354560 Apr 24 20:48 /dev/disks/mpx.vmhba32:C0:T0:L0:9
-rw-------    1 root     root     299876352 Apr 24 20:48 /dev/disks/mpx.vmhba32:C0:T0:L0:8
-rw-------    1 root     root     115326976 Apr 24 20:48 /dev/disks/mpx.vmhba32:C0:T0:L0:7
-rw-------    1 root     root     262127616 Apr 24 20:48 /dev/disks/mpx.vmhba32:C0:T0:L0:6
-rw-------    1 root     root     262127616 Apr 24 20:48 /dev/disks/mpx.vmhba32:C0:T0:L0:5



As we can see, there are some big temp files in the list, so the next step is to delete some of them. Note: double-check which files you want to delete, and don't delete any log files that you could need for troubleshooting or audits. After deleting the files we will not need (including the files we partially copied from the other ESXi host, and all of the folders inside the locker/store folder), we can check the space.
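A sketch of that cleanup, using the paths seen in this host's output; treat each rm here as an assumption to verify before running anything destructive:

# Remove the partially copied files and folders from the locker volume:
rm -rf /locker/packages/*
# Remove the large staged file that the find command surfaced in /tmp
# (VUM recreates it on the next stage operation):
rm -f /tmp/stagebootbank/s.v00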

[root@esxi721:] df -h
Filesystem    Size   Used Available Use% Mounted on
vfat        285.8M 160.0K    285.7M   0% /vmfs/volumes/56bb2e57-7933dd24-7e9c-00110a6930e4



Usage is now back around 0%, with a lot of free space. We copy the files again from the other ESXi host, and this time the transfer completes 100%. Now, using VUM, we scan, stage, and remediate the ESXi host, and the problem is fixed. After a final reboot (from the remediation) the ESXi host is fully updated.
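If you want to verify that the copy landed where VUM expects it before re-running the remediation, a quick listing works (a sanity check, using the paths above):

# Confirm the version folder and tools images are back in place:
ls /locker/packages/6.0.0/
df -h | grep vfat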
 

I hope this article helps you fix this error if you encounter it during your ESXi upgrades.

This is part of my "TIP Articles" series. So, please vote "Helpful" on this article. I encourage your comments and feedback.