Solved

CRC check with path depth > 256 chars

Posted on 2013-06-05
600 Views
Last Modified: 2013-06-17
We have created a file-copy tool based on robocopy. After the copy job we need to run a CRC check to compare the source and destination files.
Robocopy does the copy job without problems, even with path depths considerably longer than 256 characters.
The subsequent CRC check (self-developed in PowerShell, and alternatively in VBS) can't read the files in these long paths. The cause seems to be the use of the Win32 API.

Is there a way to bypass the Win32 API when accessing files, as robocopy apparently does internally?
Or is there a CRC check tool which can handle long paths?

The OS in use can be anything from Windows Server 2003 to Windows Server 2008 R2.
Question by:hpnix4ever
9 Comments
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39224166
The CRC32 algorithm is well understood, and working open source / public domain subroutines can be found online for any language you desire.

Write the code yourself by incorporating a public domain subroutine. Then you know that what you have will work.
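
For illustration, a minimal table-driven CRC-32 sketch in C++ along the lines dlethe describes (the common reflected polynomial 0xEDB88320 used by ZIP and PNG; the function name and layout are just one possible arrangement):

    #include <cstdint>
    #include <cstddef>

    // Table-driven CRC-32 (IEEE 802.3, reflected polynomial 0xEDB88320).
    // Pass the previous return value as 'crc' to checksum data in chunks.
    uint32_t crc32(const unsigned char* data, size_t len, uint32_t crc = 0)
    {
        static uint32_t table[256];
        static bool built = false;
        if (!built) {
            for (uint32_t i = 0; i < 256; ++i) {
                uint32_t c = i;
                for (int k = 0; k < 8; ++k)
                    c = (c & 1) ? 0xEDB88320u ^ (c >> 1) : c >> 1;
                table[i] = c;
            }
            built = true;
        }
        crc = ~crc;                               // standard pre-inversion
        for (size_t i = 0; i < len; ++i)
            crc = table[(crc ^ data[i]) & 0xFF] ^ (crc >> 8);
        return ~crc;                              // standard post-inversion
    }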
 
LVL 24

Assisted Solution

by:Coralon
Coralon earned 40 total points
ID: 39224428
My understanding is that if you are accessing the paths by the "normal" path names, you will hit the MAX_PATH limitation of 260 characters.  If you use the alternate syntax, you should be able to use the entire path.

That alternate syntax is something like \\?\c:\xxxxxx  
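
As a sketch of what that looks like from C++ (the Unicode CreateFileW accepts \\?\-prefixed extended-length paths past MAX_PATH; the path literal below is only a placeholder):

    #include <windows.h>

    // The \\?\ prefix makes the Unicode Win32 APIs skip path normalization
    // and hand the path straight to the filesystem, lifting the 260-character
    // MAX_PATH limit. Note the doubled backslashes in the C string literal.
    int main()
    {
        HANDLE h = CreateFileW(L"\\\\?\\C:\\some\\very\\deep\\path\\file.dat",
                               GENERIC_READ, FILE_SHARE_READ, NULL,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE)
            return 1;
        // ... ReadFile() here and feed each buffer to the CRC routine ...
        CloseHandle(h);
        return 0;
    }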

Coralon
 

Author Comment

by:hpnix4ever
ID: 39224786
Thanks for the answer, but this doesn't solve the real problem. The main problem is that PowerShell and VBS use the Win32 API and its MAX_PATH limitation. The CRC check code itself works fine, but we can't read files in a deep path structure. We are looking for a way to use the ntdll.dll interface to read/write files; ntdll.dll does not have the MAX_PATH limitation.

Using UNC names doesn't solve the problem. A workaround, though not really practicable, is to split the long path into parts and map them to drive letters.
I think the only clean solution is to use ntdll.dll natively for file operations.

Hans
 
LVL 24

Assisted Solution

by:Coralon
Coralon earned 40 total points
ID: 39227872
I'm not a programmer... the \\?\ syntax worked for me at the command prompt to physically delete files and folders in extremely deep paths, which is why I suggested it.

Good luck!

Coralon
 

Accepted Solution

by:hpnix4ever
hpnix4ever earned 0 total points
ID: 39237713
We are using UNC paths; the problem is not based on this.

We have now ported the code from PowerShell to good old C++ in order to use the ntdll.dll API natively. There is no problem with the path depth any more.

Thanks to all for the answers; the problem is now solved.

Regards

Hans
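
For readers who land here with the same problem, a rough sketch of what a native ntdll.dll file open can look like. This is an illustration of the general approach, not Hans' actual code; it assumes the Windows SDK's winternl.h declarations and linking against ntdll.lib, and the path literal is a placeholder (the native API takes NT-namespace paths of the form \??\C:\...):

    #include <windows.h>
    #include <winternl.h>
    #pragma comment(lib, "ntdll.lib")

    // CreateDisposition / CreateOptions / flag values from the DDK headers,
    // repeated here so the sketch is self-contained.
    #ifndef FILE_OPEN
    #define FILE_OPEN                     0x00000001
    #endif
    #ifndef FILE_SYNCHRONOUS_IO_NONALERT
    #define FILE_SYNCHRONOUS_IO_NONALERT  0x00000020
    #endif
    #ifndef FILE_NON_DIRECTORY_FILE
    #define FILE_NON_DIRECTORY_FILE       0x00000040
    #endif
    #ifndef OBJ_CASE_INSENSITIVE
    #define OBJ_CASE_INSENSITIVE          0x00000040
    #endif
    #ifndef NT_SUCCESS
    #define NT_SUCCESS(s) (((NTSTATUS)(s)) >= 0)
    #endif

    // Open an existing file via the native API, which is not bound by
    // MAX_PATH. The returned handle works with ReadFile() as usual.
    HANDLE OpenLongPath(void)
    {
        UNICODE_STRING name;
        RtlInitUnicodeString(&name, L"\\??\\C:\\some\\very\\deep\\path\\file.dat");

        OBJECT_ATTRIBUTES attr;
        InitializeObjectAttributes(&attr, &name, OBJ_CASE_INSENSITIVE, NULL, NULL);

        IO_STATUS_BLOCK iosb;
        HANDLE h = NULL;
        NTSTATUS status = NtCreateFile(&h, FILE_GENERIC_READ, &attr, &iosb,
                                       NULL,                    // allocation size
                                       FILE_ATTRIBUTE_NORMAL,
                                       FILE_SHARE_READ,
                                       FILE_OPEN,               // open, don't create
                                       FILE_SYNCHRONOUS_IO_NONALERT |
                                           FILE_NON_DIRECTORY_FILE,
                                       NULL, 0);                // no EA buffer
        return NT_SUCCESS(status) ? h : INVALID_HANDLE_VALUE;
    }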
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39237739
It probably also runs faster than before, especially if you profiled the code and ran it through the optimizer. The DLL code is generally not optimized for your specific hardware; it is built for portability, not efficiency.

I wrote a similar application years ago, used Intel's optimizing compiler, and profiled it. (Profiling means running a specially instrumented build of the code against real data and letting it write a bunch of log files as it runs, to figure out which chunks of code are executed the most, how branches are predicted, and so on. You then recompile with this log data as part of the input, run again, repeat, then build the final version.)

To make a long story short, the optimized and re-optimized code ran about 10x faster.
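
For the Visual C++ toolchain the asker is likely using, that profile-guided loop looks roughly like this (Intel's compiler has analogous switches; crc_tool.cpp is a hypothetical source file name):

    rem 1. Compile for link-time codegen, link an instrumented build
    cl /O2 /GL crc_tool.cpp /link /LTCG:PGINSTRUMENT

    rem 2. Run the instrumented binary against representative data;
    rem    it records profile counts (.pgc) next to the .pgd database
    crc_tool.exe D:\testdata

    rem 3. Relink, letting the linker optimize from the profile data
    cl /O2 /GL crc_tool.cpp /link /LTCG:PGOPTIMIZE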
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39237755
This IS the type of application that would benefit a great deal from profiling-based optimization, especially if you are using a multi-core system and a high-end 64-bit processor.

If speed IS an issue, please consider benchmarking against the original version (with paths shorter than 256 characters so it runs), then optimizing and reporting the results.

Unless the bottleneck is the network or disk I/O, you could easily at least triple performance.
 
LVL 47

Expert Comment

by:dlethe
ID: 39241239
Actually, this will be more efficient, since the work for each CRC was previously done in a DLL, was it not? That DLL binary code would not be optimized for the hardware the user is running; it must be written for the lowest common denominator in hardware for the sake of portability.

So the C code IS the fastest and most efficient way to resolve the issue ... IF that code is compiled with optimization and, ideally, profiled.
 

Author Closing Comment

by:hpnix4ever
ID: 39252567
The C++ solution is suboptimal in this case, because all the rest of the application is PowerShell code and should not be translated. With a large number of files, a separate program would have to be called for each file to run the CRC check, which would eat up the benefit completely.

Regards

Hans
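
One way to keep the rest in PowerShell and still avoid the per-file process cost (a sketch only, not Hans' code): have the C++ helper read a list of paths from stdin and print one checksum per line, so it is launched exactly once per job. This assumes the crc32() routine from the earlier sketch and hypothetical tool and file names:

    #include <windows.h>
    #include <cstdint>
    #include <cstdio>
    #include <iostream>
    #include <string>
    #include <vector>

    // Table-driven CRC-32 from the earlier sketch in this thread.
    uint32_t crc32(const unsigned char* data, size_t len, uint32_t crc);

    int main()
    {
        std::string line;
        std::vector<unsigned char> buf(1 << 20);      // 1 MiB read buffer
        while (std::getline(std::cin, line)) {
            if (line.empty())
                continue;
            // Widen the path and prepend the \\?\ extended-length prefix
            // (the prefix requires absolute paths).
            int n = MultiByteToWideChar(CP_ACP, 0, line.c_str(), -1, NULL, 0);
            std::vector<wchar_t> wide(n);
            MultiByteToWideChar(CP_ACP, 0, line.c_str(), -1, &wide[0], n);
            std::wstring path = L"\\\\?\\";
            path += &wide[0];

            HANDLE h = CreateFileW(path.c_str(), GENERIC_READ, FILE_SHARE_READ,
                                   NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL,
                                   NULL);
            if (h == INVALID_HANDLE_VALUE) {
                std::printf("ERROR     %s\n", line.c_str());
                continue;
            }
            uint32_t crc = 0;
            DWORD got = 0;
            while (ReadFile(h, &buf[0], (DWORD)buf.size(), &got, NULL) && got > 0)
                crc = crc32(&buf[0], got, crc);       // checksum in chunks
            CloseHandle(h);
            std::printf("%08X  %s\n", crc, line.c_str());
        }
        return 0;
    }

PowerShell would then pipe the whole file list through in one go, e.g. Get-Content files.txt | .\crc_tool.exe, instead of spawning one process per file.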
