Solved

CRC check with path deep > 256 chars

Posted on 2013-06-05
Medium Priority
607 Views
Last Modified: 2013-06-17
We have created a file-copy tool based on Robocopy. After the copy job we need a CRC check to compare the source and destination files.
Robocopy handles the copy without problems, even with path depths well beyond 256 characters.
Our CRC check (developed in-house in PowerShell, with a VBS alternative) cannot read the files in those long paths. The cause appears to be the use of the Win32 API.

Is there a way to bypass the Win32 API when accessing files, as Robocopy apparently does internally?
Or is there a CRC check tool that can handle long paths?

The OS in use ranges from Windows Server 2003 to Windows Server 2008 R2.
Question by:hpnix4ever
9 Comments
 
LVL 47

Assisted Solution

by:David
David earned 920 total points
ID: 39224166
The CRC32 algorithm is well understood, and working open-source / public-domain subroutines can be found online for any language you desire.

Write the code yourself by incorporating a public-domain subroutine. Then you know what you have will work.
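For reference, a minimal sketch of what such a public-domain-style routine typically looks like: the standard table-driven CRC-32 with the reflected polynomial 0xEDB88320 (the variant used by zip, zlib, and most "crc32" tools). The function name here is just illustrative.

```cpp
#include <cstdint>
#include <cstddef>

// Standard table-driven CRC-32 (reflected polynomial 0xEDB88320,
// as used by zip, zlib, PNG, and most common "crc32" utilities).
uint32_t crc32(const unsigned char* data, size_t len) {
    static uint32_t table[256];
    static bool init = false;
    if (!init) {
        // Build the 256-entry lookup table once.
        for (uint32_t i = 0; i < 256; ++i) {
            uint32_t c = i;
            for (int k = 0; k < 8; ++k)
                c = (c & 1) ? (0xEDB88320u ^ (c >> 1)) : (c >> 1);
            table[i] = c;
        }
        init = true;
    }
    uint32_t crc = 0xFFFFFFFFu;              // standard initial value
    for (size_t i = 0; i < len; ++i)
        crc = table[(crc ^ data[i]) & 0xFFu] ^ (crc >> 8);
    return crc ^ 0xFFFFFFFFu;                // standard final XOR
}
```

Over the ASCII bytes "123456789" this yields the conventional check value 0xCBF43926, which is a quick way to verify any implementation you pick up.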
 
LVL 25

Assisted Solution

by:Coralon
Coralon earned 80 total points
ID: 39224428
My understanding is that if you access the paths by their "normal" path names, you will hit the MAX_PATH limitation of 260 characters.  If you use the alternate syntax, you should be able to use the entire path.

That alternate syntax is something like \\?\c:\xxxxxx
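To make the prefixing concrete, here is a small sketch of how a tool could build the extended-length form (the helper name is made up, and the input is assumed to be an absolute, already-normalized path, since the \\?\ form turns off Win32 path normalization):

```cpp
#include <string>

// Hypothetical helper: convert a fully qualified Windows path to the
// extended-length ("\\?\") form that bypasses the ~260-character
// MAX_PATH limit in the Win32 API. Drive paths get the "\\?\" prefix;
// UNC paths become "\\?\UNC\server\share\...". The caller must pass an
// absolute, normalized path (backslashes, no "." or ".." components),
// because the "\\?\" form disables the usual path normalization.
std::wstring toExtendedLengthPath(const std::wstring& path) {
    if (path.compare(0, 4, L"\\\\?\\") == 0)
        return path;                              // already extended
    if (path.compare(0, 2, L"\\\\") == 0)
        return L"\\\\?\\UNC\\" + path.substr(2);  // \\srv\share -> \\?\UNC\srv\share
    return L"\\\\?\\" + path;                     // C:\dir -> \\?\C:\dir
}
```

The resulting string can then be handed to the wide Win32 file functions (e.g. CreateFileW) to open files deeper than MAX_PATH.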

Coralon
 

Author Comment

by:hpnix4ever
ID: 39224786
Thanks for the answer, but that does not solve the real problem. The main problem is that PowerShell and VBS use the Win32 API with its MAX_PATH limitation. The CRC-check code itself works fine, but we can't read files in a deep path structure. We are looking for a way to use the NTDLL interface to read/write files; NTDLL does not have the MAX_PATH limitation.

Using UNC names doesn't solve the problem. A workaround, though not really practicable, is to split the long path into parts and map each part to a drive.
I think the only clean solution is to use NTDLL natively for file operations.

Hans

 
LVL 25

Assisted Solution

by:Coralon
Coralon earned 80 total points
ID: 39227872
I'm not a programmer. The UNC syntax worked for me at the command prompt to physically delete files and paths with extremely deep paths, which is why I suggested it.

Good luck!

Coralon
 

Accepted Solution

by:hpnix4ever
hpnix4ever earned 0 total points
ID: 39237713
We are already using UNC paths; the problem is not caused by that.

We have now rewritten the code from PowerShell to good old C++ so we can use the NTDLL API natively. With that, the depth of the paths is no longer a problem.
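For illustration, the general shape of such a checker, sketched here with portable C++ stream I/O rather than the actual NTDLL calls (on Windows the file would be opened via an extended-length or native NT path; the function name is hypothetical):

```cpp
#include <cstdint>
#include <fstream>
#include <string>

// Sketch: CRC-32 of a file, read in fixed-size chunks so arbitrarily
// large files need only a small, constant buffer. Uses the standard
// zip/zlib polynomial 0xEDB88320.
uint32_t fileCrc32(const std::string& path) {
    static uint32_t table[256];
    static bool init = false;
    if (!init) {
        for (uint32_t i = 0; i < 256; ++i) {
            uint32_t c = i;
            for (int k = 0; k < 8; ++k)
                c = (c & 1) ? (0xEDB88320u ^ (c >> 1)) : (c >> 1);
            table[i] = c;
        }
        init = true;
    }
    std::ifstream in(path, std::ios::binary);
    uint32_t crc = 0xFFFFFFFFu;
    char buf[64 * 1024];  // 64 KB read buffer
    // Process full chunks, then the final partial chunk (gcount > 0).
    while (in.read(buf, sizeof(buf)) || in.gcount() > 0) {
        std::streamsize n = in.gcount();
        for (std::streamsize i = 0; i < n; ++i)
            crc = table[(crc ^ static_cast<unsigned char>(buf[i])) & 0xFFu]
                  ^ (crc >> 8);
    }
    return crc ^ 0xFFFFFFFFu;
}
```

Reading in 64 KB chunks keeps memory use constant regardless of file size, which matters when comparing large copied files.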

Thanks to everyone for the answers; the problem is now solved.

Regards

Hans
 
LVL 47

Assisted Solution

by:David
David earned 920 total points
ID: 39237739
It probably also runs faster than before, especially if you profiled the code and ran it through the optimizer.  DLL code is typically not optimized for the target machine; it is built for portability, not efficiency.

I wrote a similar application years ago and used Intel's optimizing compiler and profiled it.  (Profiling means running a specially instrumented build of the code on real data and letting it write log files as it runs, to figure out which chunks of code execute the most and how branches tend to go. You then recompile with that log data as part of the input, run again, repeat, then build the final version.)

To make a long story short, the optimized and re-optimized code ran about 10x faster.
 
LVL 47

Assisted Solution

by:David
David earned 920 total points
ID: 39237755
This IS the type of application that would benefit a great deal from profiling-based optimization, especially if you are using a multi-core system and a high-end 64-bit processor.

If speed IS an issue, please consider benchmarking against the original version (with paths under 256 characters, so it runs), then optimizing and reporting the results.

Unless the bottleneck is disk I/O or the network, you could easily at least triple performance.
 
LVL 47

Expert Comment

by:David
ID: 39241239
Actually this will be more efficient, as the work for each CRC was previously done in a DLL, was it not?  That DLL binary code would not be optimized for the hardware the user is running; it must be built for the lowest common denominator in hardware, for portability's sake.

So the C code IS the fastest and most efficient way to resolve the issue ... IF that code was compiled with optimization and, ideally, profiled.
 

Author Closing Comment

by:hpnix4ever
ID: 39252567
The C++ solution is in this case suboptimal, since the rest of the application is PowerShell code and should not be translated. For a large number of files, a separate program would have to be called for each file to do its CRC check, and that would eat up the benefit completely.

Regards

Hans
