Solved

CRC check with path depth > 256 chars

Posted on 2013-06-05
602 Views
Last Modified: 2013-06-17
We have created a file-copy tool based on Robocopy. After the copy job we need to run a CRC check to compare the source and destination files.
Robocopy does the copy job without problems, even with path depths well beyond 256 characters.
The CRC check that follows it (self-developed in PowerShell, with a VBS alternative) can't read files in those long paths. The cause seems to be the use of the Win32 API.

Is there a way to bypass the Win32 API when accessing files, as Robocopy apparently does internally?
Or is there a CRC check tool which can handle long paths?

The operating systems in use range from Windows Server 2003 to Windows Server 2008 R2.
Question by:hpnix4ever
9 Comments
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39224166
The CRC32 algorithm is well understood, and working open-source / public-domain subroutines can be found online for any language you desire.

Write the code yourself by incorporating a public-domain subroutine. Then you know that what you have will work.
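For illustration, here is a minimal, self-contained table-driven CRC32 in C++, using the standard reflected polynomial 0xEDB88320 (the same variant zlib and most file-verification tools use). This is a generic sketch of the public-domain routine, not code from this thread:

```cpp
#include <cstdint>
#include <cstdio>

// Lookup table for the standard (zlib / IEEE 802.3) CRC32,
// reflected polynomial 0xEDB88320.
static uint32_t crc_table[256];

static void init_crc_table() {
    for (uint32_t i = 0; i < 256; ++i) {
        uint32_t c = i;
        for (int k = 0; k < 8; ++k)
            c = (c & 1) ? 0xEDB88320u ^ (c >> 1) : c >> 1;
        crc_table[i] = c;
    }
}

// Update a running CRC with a buffer. Start with 0xFFFFFFFF and
// XOR the final result with 0xFFFFFFFF.
static uint32_t crc32_update(uint32_t crc, const unsigned char* buf, size_t len) {
    for (size_t i = 0; i < len; ++i)
        crc = crc_table[(crc ^ buf[i]) & 0xFF] ^ (crc >> 8);
    return crc;
}

int main() {
    init_crc_table();
    const unsigned char data[] = "123456789";
    uint32_t crc = crc32_update(0xFFFFFFFFu, data, sizeof(data) - 1) ^ 0xFFFFFFFFu;
    printf("%08X\n", crc);  // standard test vector: prints CBF43926
    return 0;
}
```

The table-driven form processes one byte per lookup instead of eight bit-by-bit steps, which matters when both sides of a large copy job have to be hashed.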
 
LVL 25

Assisted Solution

by:Coralon
Coralon earned 40 total points
ID: 39224428
My understanding is that if you access the paths by the "normal" path names, you will hit the MAX_PATH limitation of 260 characters.  If you use the alternate syntax, you should be able to use the entire path.

That alternate syntax is something like \\?\c:\xxxxxx  
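For illustration (not code from this thread, and the path is a hypothetical placeholder): from C or C++ the prefix can be passed straight to the wide-character Win32 API:

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // The \\?\ prefix makes the wide-character Win32 APIs skip normal
    // path parsing and accept paths up to ~32,767 characters instead of
    // MAX_PATH (260). The path below is a hypothetical placeholder.
    const wchar_t* path = L"\\\\?\\C:\\some\\very\\deep\\path\\file.dat";

    HANDLE h = CreateFileW(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                           OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFileW failed: %lu\n", GetLastError());
        return 1;
    }
    // ... read the file and feed the buffers into the CRC32 routine ...
    CloseHandle(h);
    return 0;
}
```

Note that a \\?\ path must be fully qualified, and remote paths take the form \\?\UNC\server\share\....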

Coralon
 

Author Comment

by:hpnix4ever
ID: 39224786
Thanks for the answer, but that doesn't address the real problem. The main problem is that PowerShell and VBS use the Win32 API, with its MAX_PATH limitation. The CRC check code itself works fine, but we can't read files in a deep path structure. We are looking for a way to use the ntdll.dll interface to read/write files; ntdll.dll does not have the MAX_PATH limitation.

Using UNC names doesn't solve the problem. A workaround, though not really practicable, is to split the long path into parts and map each part to a drive letter.
I think the only clean solution is to use ntdll.dll natively for file operations.

Hans
 
LVL 25

Assisted Solution

by:Coralon
Coralon earned 40 total points
ID: 39227872
I'm not a programmer, but the \\?\ (UNC-style) syntax worked for me at the command prompt to physically delete files and paths at extreme depths, which is why I suggested it.

Good luck!

Coralon
 

Accepted Solution

by:hpnix4ever
hpnix4ever earned 0 total points
ID: 39237713
We are already using UNC paths; the problem is not caused by that.

We have now rewritten the code from PowerShell to good old C++ so we can use the ntdll.dll API natively. The depth of the paths is no longer a problem.

Thanks to all for the answers; the problem is now solved.

Regards

Hans
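The asker's code is not posted in the thread, but a minimal sketch of the ntdll route might look like the following, assuming a Visual C++ toolchain with the Windows SDK's winternl.h and ntdll.lib; the path and names are hypothetical:

```cpp
#include <windows.h>
#include <winternl.h>
#include <cstdio>

#pragma comment(lib, "ntdll.lib")

// Not all native constants are in winternl.h; define the ones we need.
#ifndef FILE_OPEN
#define FILE_OPEN 0x00000001
#endif
#ifndef FILE_SYNCHRONOUS_IO_NONALERT
#define FILE_SYNCHRONOUS_IO_NONALERT 0x00000020
#endif
#ifndef FILE_NON_DIRECTORY_FILE
#define FILE_NON_DIRECTORY_FILE 0x00000040
#endif

int main() {
    // ntdll expects NT-native paths: \??\C:\... instead of C:\...
    // The path below is a hypothetical placeholder.
    UNICODE_STRING name;
    RtlInitUnicodeString(&name, L"\\??\\C:\\some\\very\\deep\\path\\file.dat");

    OBJECT_ATTRIBUTES attr;
    InitializeObjectAttributes(&attr, &name, OBJ_CASE_INSENSITIVE, NULL, NULL);

    HANDLE file = NULL;
    IO_STATUS_BLOCK iosb = {};
    NTSTATUS status = NtCreateFile(
        &file, GENERIC_READ | SYNCHRONIZE, &attr, &iosb, NULL,
        FILE_ATTRIBUTE_NORMAL, FILE_SHARE_READ, FILE_OPEN,
        FILE_NON_DIRECTORY_FILE | FILE_SYNCHRONOUS_IO_NONALERT, NULL, 0);
    if (status < 0) {  // negative NTSTATUS means failure
        fprintf(stderr, "NtCreateFile failed: 0x%08lX\n", (unsigned long)status);
        return 1;
    }

    // With a synchronous handle, plain ReadFile works for the CRC loop.
    unsigned char buf[64 * 1024];
    DWORD read = 0;
    while (ReadFile(file, buf, sizeof(buf), &read, NULL) && read > 0) {
        // crc = crc32_update(crc, buf, read);  // see the CRC32 sketch above
    }

    CloseHandle(file);
    return 0;
}
```

A handle opened synchronously this way can be read with the ordinary Win32 ReadFile, so only the open itself has to go through ntdll.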
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39237739
It probably also runs faster than before, especially if you profiled the code and ran it through the optimizer. The DLL code is never optimized for the specific hardware; it is built for portability, not efficiency.

I wrote a similar application years ago and used Intel's optimizing compiler and profiled it. (Profiling means running a special instrumented build of the code against real data and letting it write a bunch of log files as it runs, to figure out which chunks of code are executed the most, how branches behave, and so on. You then recompile with this log data as part of the input, run again, repeat, then build the final version.)

To make a long story short, the optimized and re-optimized code ran about 10x faster.
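The exact commands from back then are not in the thread, but the workflow described above is profile-guided optimization; with the Microsoft and Intel compilers of that era it looks roughly like this (tool and file names are hypothetical; verify the flags against your compiler's documentation):

```
:: MSVC: compile with whole-program optimization, link an instrumented build
cl /O2 /GL /c crctool.cpp
link /LTCG:PGINSTRUMENT crctool.obj
:: run the instrumented build on representative data (writes .pgc profile files)
crctool.exe < training-filelist.txt
:: re-link using the collected profile
link /LTCG:PGOPTIMIZE crctool.obj

:: Intel C++ compiler equivalent
icl /Qprof-gen crctool.cpp
crctool.exe < training-filelist.txt
icl /Qprof-use /O2 crctool.cpp
```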
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39237755
This IS the type of application that would benefit a great deal from profile-guided optimization, especially if you are using a multi-core system and a high-end 64-bit processor.

If speed IS an issue, please consider benchmarking against the original version (with paths shorter than 256 characters so it runs), then optimizing and reporting the results.

Unless the bottleneck is disk I/O or the network, you could easily at least triple performance.
 
LVL 47

Expert Comment

by:dlethe
ID: 39241239
Actually this will be more efficient, as the work for each CRC was previously done in a DLL, was it not?  That DLL binary code would not be optimized for the hardware the user is running, as it must be written for the lowest common denominator in hardware due to portability.

So the C++ code IS the fastest and most efficient way to resolve the issue ... IF that code is compiled with optimization and, ideally, profiled.
 

Author Closing Comment

by:hpnix4ever
ID: 39252567
The C++ solution is suboptimal in this case, because the rest of the application is PowerShell code and should not be translated. For a large number of files, a separate program has to be called for each file to do the CRC check, which eats up the benefit completely.

Regards

Hans
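One possible middle ground, sketched here purely as a suggestion rather than anything from the thread: keep the CRC tool in C++, but let a single invocation process an entire file list from stdin, so PowerShell pays the process-start cost once rather than per file. This sketch uses the \\?\-prefixed Win32 wide API for brevity (the same batching works with the NtCreateFile approach above) and prints the file size where the CRC32 loop from the first sketch would go:

```cpp
#include <windows.h>
#include <iostream>
#include <string>

// Reads one path per line on stdin and prints one result line per file,
// so the whole list is handled by a single process. The tool name and
// output format are hypothetical placeholders.
int wmain() {
    std::wstring path;
    while (std::getline(std::wcin, path)) {
        if (path.empty()) continue;
        // Prefix with \\?\ to lift the MAX_PATH limit; paths must be
        // fully qualified (UNC paths would need \\?\UNC\server\share\...).
        std::wstring longPath = L"\\\\?\\" + path;
        HANDLE h = CreateFileW(longPath.c_str(), GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            std::wcout << L"ERROR\t" << path << std::endl;
            continue;
        }
        LARGE_INTEGER size;
        GetFileSizeEx(h, &size);
        // ... replace with the CRC32 read loop here ...
        std::wcout << size.QuadPart << L"\t" << path << std::endl;
        CloseHandle(h);
    }
    return 0;
}
```

Invocation from PowerShell would then be something like `Get-Content filelist.txt | .\crctool.exe`, with one process launch for the whole list.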
