jkirman

asked on

Carbonite backup failing with zwccontroller.exe error

Greetings,

I am supporting a physical Windows 2012 server at a client site.  For several years we had Carbonite cloud backup working without any problems.  Jobs were configured to first back up data to a local USB drive, and Carbonite would then take the resulting compressed local backup file and upload it to Carbonite's cloud storage.  The client's internet provider went belly up a few months ago and the office lost internet service, so I had user PCs connecting to the internet through Wi-Fi adapters and a Verizon MiFi acting as a local hotspot.  Very slow, but it worked.  Obviously, for security purposes, the server was not connecting to Wi-Fi during this period.

Internet service was finally restored through a new provider in August, and since then Carbonite has never worked properly; most cloud backup uploads fail to complete.  I have updated Carbonite to the latest version, but I am getting a regular error in the Application Log when running any cloud backup of c. 15 GB or larger, with the following information:

Faulting application name: ZWCController.exe, version: 6.0.0.0, time stamp: 0x5f0317b2
Faulting module name: MSVCR120.dll, version: 12.0.40649.5, time stamp: 0x56bc00d3
Faulting application path: C:\Program Files\Carbonite\Carbonite Safe Server Backup(x64)\bin\ZWCController.exe
Faulting module path: C:\Windows\SYSTEM32\MSVCR120.dll
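
In case it helps anyone reproducing this, the crash events can be pulled out of the Application log with something like the following - a minimal Python sketch that just wraps the built-in wevtutil tool (it assumes Python is available on the server; the event count is only an example):

import subprocess

# Query the Application log for Event ID 1000 (application crashes), newest first,
# and keep only the events that mention ZWCController.exe.
cmd = [
    "wevtutil", "qe", "Application",
    "/q:*[System[(EventID=1000)]]",   # XPath filter for application-error events
    "/c:50",                          # at most the 50 most recent matches
    "/rd:true",                       # reverse direction = newest first
    "/f:text",                        # human-readable text output
]
result = subprocess.run(cmd, capture_output=True, text=True)

# In text format wevtutil prefixes each event with "Event[n]:", so split on that.
for block in result.stdout.split("Event["):
    if "ZWCController.exe" in block:
        print("Event[" + block.strip())
        print("-" * 60)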

The MSVCR120.dll file is a Visual C++ library file.  Over the weekend I manually removed and reinstalled this library with the latest version (release 5), removed and reinstalled Carbonite, and the errors persist.  I can run a smaller backup job - 10 GB - to both local and cloud backup, but as mentioned, when running anything a few GB larger than that, the upload fails and throws the above error in the application log.
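
To double-check which runtime actually landed after a reinstall, the installed Visual C++ 2013 entries can be read straight from the standard uninstall registry keys - a rough Python sketch of that check (standard library only, and again it assumes Python is on the box):

import winreg

# The two standard uninstall hives (64-bit and 32-bit packages).
UNINSTALL_PATHS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall",
]

for path in UNINSTALL_PATHS:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        subkey_count = winreg.QueryInfoKey(key)[0]
        for i in range(subkey_count):
            name = winreg.EnumKey(key, i)
            try:
                with winreg.OpenKey(key, name) as sub:
                    display = winreg.QueryValueEx(sub, "DisplayName")[0]
                    if "Visual C++ 2013" in display:
                        version = winreg.QueryValueEx(sub, "DisplayVersion")[0]
                        print(f"{display}  ->  {version}")
            except OSError:
                continue  # some entries have no DisplayName/DisplayVersion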

Carbonite tech support is saying it's a Visual C++ 2013 issue - i.e. call Microsoft - even though some jobs are succeeding.  I have another client running the same build of Carbonite and Visual C++ 2013 who is not having any issues.

Any thoughts on how to approach this, or suggestions for troubleshooting?  I appreciate any relevant experience and/or suggestions in advance.

Many thanks.

jkirman

Ramasamy P

I think you are running incremental backups for these files.  Can you move your old backup catalog aside and start with a fresh backup?  I suspect that the timestamps of these files not matching the older files already in the backup might be causing the issue.
jkirman

ASKER

Thank you, Ramasamy.  My apologies for not responding sooner.  We are in fact running only full backups.  As part of my diagnostics I have done the following:

- a full removal and reinstall of Carbonite, adding a couple of jobs back manually (I did not import the backup sets, which are normally saved in the cloud)
- removal and reinstall of the Visual C++ 2013 library
- an update of the Visual C++ 2013 library to release/update 5

- a full disk + cloud backup of a primary data folder store of c. 78 GB (a quick way to size a folder tree is sketched below).  The disk backup completed OK, but the cloud backup failed at c. 50 GB.
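
Here is that folder-sizing sketch - plain Python, standard library only; the path is just a placeholder for the actual data store:

import os

def tree_size_gb(root):
    """Walk a folder tree and return its total size in GB."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                total += os.path.getsize(path)
            except OSError as exc:
                print(f"could not stat {path}: {exc}")
    return total / (1024 ** 3)

root = r"D:\Data\MainStore"   # placeholder path for the folder store being backed up
print(f"{tree_size_gb(root):.1f} GB")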

I decided to see whether backing up to the USB disk first and then to the cloud was the issue, so I began testing jobs backed up to the cloud only.  An incremental backup job of c. 8 GB to cloud only, from one of our main document stores, succeeded.

I then ran a couple of small jobs (up to 10 GB) to both disk / local USB and cloud without issues.  

I ran one of the main jobs we normally run - a documents folder store called Programs with c. 10 GB of data - backed it up to disk and cloud, and that worked OK.

In my last call to Carbonite and my conversation with Escalations, the tech I spoke to said that the reason for the ZWCController.exe errors with Visual C++ may be that the C++ library is failing when the job size gets too big.  I have a hard time believing that is the issue, and I think he's using it to avoid digging further into what is causing the cloud backup failures.

I have since been running a sequence of test backups of patch/install files, starting with a job size of 15 GB and increasing each successive test job by c. 3.5 GB.  Backups are being saved to both disk and cloud.  I'm trying to see if it does in fact fail at a certain job size.  The last job I ran was 32 GB and completed without issue, and I should be up to a 50 GB job size by tomorrow afternoon.  I am doing this simply so that when I call Carbonite back, they cannot tell me that job size is the reason for the error and will have to look deeper into the issue.
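
For anyone curious, the test sets can be staged with something like the following - a small Python sketch that copies files from a pool of patch/install media into a staging folder until it reaches the target size for that run (the paths and target size are placeholders):

import os
import shutil

SOURCE = r"D:\InstallMedia"      # placeholder: pool of patch/install files
STAGING = r"D:\CarboniteTest"    # placeholder: folder the test job backs up
TARGET_GB = 35.5                 # e.g. last run was 32 GB, step is c. 3.5 GB

def stage(target_gb):
    budget = target_gb * 1024 ** 3
    copied = 0
    os.makedirs(STAGING, exist_ok=True)
    for dirpath, _dirs, files in os.walk(SOURCE):
        for name in files:
            src = os.path.join(dirpath, name)
            dst = os.path.join(STAGING, name)
            if os.path.exists(dst):          # crude de-dup; fine for a test set
                continue
            size = os.path.getsize(src)
            if copied + size > budget:
                print(f"staged {copied / 1024 ** 3:.1f} GB")
                return
            shutil.copy2(src, dst)
            copied += size
    print(f"staged {copied / 1024 ** 3:.1f} GB (source pool exhausted)")

stage(TARGET_GB)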

I am beginning to think that Carbonite is failing when the program is unable to read some of the data properly.

Either that or some other system function is not handling the data size when it reaches a certain point and is erroring out, and that is reported as the ZWCController.exe error.
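
One way to test the "unreadable data" theory directly would be to walk the failing folder store and try to read every file end to end, along these lines (a rough Python sketch; the path is a placeholder):

import os

def read_test(root, chunk_size=1024 * 1024):
    """Try to read every file under root and report anything that fails."""
    failures = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as fh:
                    while fh.read(chunk_size):
                        pass
            except OSError as exc:
                failures += 1
                print(f"READ FAILED: {path} ({exc})")
    print(f"done: {failures} unreadable file(s)")

read_test(r"D:\Data\MainStore")   # placeholder for the failing job's folder store

Locked or otherwise unreadable files will surface here as PermissionError or other OSError entries, which is exactly the kind of thing a backup agent might choke on.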

Will update this post as I test further.

Thanks.

jkirman

Ramasamy P

Really, the scenario is strange to me too.  Is any other application using the library files during the backup window?  Since it is only happening on the longer jobs, are there any timestamp changes on the library file?
jkirman

ASKER

Ramasamy, it appears the problem was occurring when I backed up a certain set of subfolders in what is their main QuickBooks folder tree.  As you may know, each QuickBooks company file has an associated subfolder, basically called its SearchIndex.  When I excluded these folders from the backup and selected only the QB company files themselves, the backup issues stopped.  So I suspect the error came from Carbonite having trouble reading or backing up some of that data, and when that happened, it threw the ZWCController.exe error.  The error message was a symptom, kind of like a read/write I/O error.
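
For anyone hitting the same thing, a quick way to see which folders to exclude is to list every subfolder whose name contains "searchindex" under the QuickBooks data tree - a minimal Python sketch (the root path is a placeholder):

import os

QB_ROOT = r"D:\QuickBooks"   # placeholder for the client's QuickBooks folder tree

# Print every subfolder whose name contains "searchindex" (case-insensitive),
# i.e. the per-company index folders described above.
for dirpath, dirnames, _files in os.walk(QB_ROOT):
    for d in dirnames:
        if "searchindex" in d.lower():
            print(os.path.join(dirpath, d))

Anything it prints is a candidate for the job's exclusion list.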

I appreciate your thoughts and feedback and marked your comments as helpful.



ASKER CERTIFIED SOLUTION
jkirman