Solved

CRC check with path deep > 256 chars

Posted on 2013-06-05
599 Views
Last Modified: 2013-06-17
We have created a file-copy tool based on Robocopy. After the copy job we need a CRC check to compare source and destination files.
Robocopy does the copy job without problems, even with path depths well beyond 256 characters.
Our CRC check (self-developed in PowerShell, alternatively in VBScript) cannot read the files in those long paths. The cause seems to be the use of the Win32 API.

Is there a way to access files while bypassing the Win32 API, as Robocopy apparently does internally?
Or is there a CRC check tool which can handle long paths?

The OS in use ranges from Windows Server 2003 to Windows Server 2008 R2.
Question by:hpnix4ever
9 Comments
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39224166
The CRC32 algorithm is well understood, and working open-source / public-domain subroutines can be found online for any language you desire.

Write the code yourself by incorporating a public-domain subroutine. Then you know what you have will work.
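As a sketch of that suggestion: the thread's own code is PowerShell/VBS, so this is only an illustration in Python, where the standard library already ships CRC-32. A minimal chunked checksum routine (function name is my own) might look like this:

```python
import zlib

def crc32_file(path, chunk_size=64 * 1024):
    """Compute the CRC-32 of a file, reading it in fixed-size chunks
    so large files never have to fit in memory. zlib.crc32 accepts a
    running value, so the checksum can be accumulated chunk by chunk."""
    crc = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF  # force an unsigned 32-bit result
```

For source/destination comparison, compute `crc32_file` on both copies and compare the two values.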
 
LVL 23

Assisted Solution

by:Coralon
Coralon earned 40 total points
ID: 39224428
My understanding is that if you access the paths by their "normal" names, you will hit the MAX_PATH limitation of 260 characters.  If you use the alternate syntax, you should be able to use the entire path length (up to roughly 32,767 characters).

That alternate syntax is something like \\?\c:\xxxxxx
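As an illustration of that rewrite (Python used only as a sketch; the helper name is my own): an absolute path can be turned into the `\\?\` extended-length form mechanically, with a separate `\\?\UNC\` variant for UNC paths.

```python
def to_extended_path(path):
    """Rewrite an absolute Windows path into the \\?\ extended-length
    form, which lifts the 260-character MAX_PATH limit for the Unicode
    Win32 file APIs. The input must already be absolute and normalized
    (no '.' or '..' components), because the \\?\ prefix disables the
    usual path normalization."""
    if path.startswith("\\\\?\\"):
        return path                      # already in extended form
    if path.startswith("\\\\"):          # UNC path \\server\share\...
        return "\\\\?\\UNC\\" + path[2:]
    return "\\\\?\\" + path              # drive-letter path C:\...
```

Whether a given scripting runtime then passes that prefix through to the Unicode Win32 APIs untouched is a separate question, which is the limitation the author runs into below.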

Coralon
 

Author Comment

by:hpnix4ever
ID: 39224786
Thanks for the answer, but this does not describe a solution to the real problem. The main problem is that PowerShell and VBScript use the Win32 API with its MAX_PATH limitation. The CRC check code itself works fine, but we can't read files in a deep path structure. We are looking for a way to use the NTDLL interface to read/write files; NTDLL does not have the MAX_PATH limitation.

Using UNC names doesn't solve the problem. A workaround, though not really practicable, is to split the long path into parts and map them to drive letters.
I think the only clean solution is to use NTDLL natively for file operations.

Hans
 
LVL 23

Assisted Solution

by:Coralon
Coralon earned 40 total points
ID: 39227872
I'm not a programmer... the UNC-style syntax worked for me in the command prompt to physically delete files and folders in extremely deep paths, which is why I suggested it.

Good luck!

Coralon

 

Accepted Solution

by:hpnix4ever
hpnix4ever earned 0 total points
ID: 39237713
We are using UNC paths; the problem is not based on that.

We have now rewritten the code from PowerShell to good old C++ in order to use the NTDLL API natively. There is no problem with path depth any more.

Thanks to all for the answers; the problem is now solved.

Regards

Hans
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39237739
It probably runs faster than before, too ... especially if you profiled the code and ran it through the optimizer. The DLL code is never optimized for your hardware; it is built for portability, not efficiency.

I wrote a similar application years ago and used Intel's optimizing compiler and profiled it.  (Profiling means running a special instrumented build of the code against real data and letting it write log files recording which chunks of code execute most often, branch behavior, and so on. You then recompile with that log data as part of the input, run again, repeat, then build the final version.)

To make a long story short, the optimized and re-optimized code ran about 10x faster.
 
LVL 47

Assisted Solution

by:dlethe
dlethe earned 460 total points
ID: 39237755
This IS the type of application that would benefit a great deal from profiling-based optimization, especially if you are using a multi-core system and a high-end 64-bit processor.

If speed IS an issue, please consider benchmarking against the original version (with paths under 256 characters so it runs), then optimizing and reporting results.

Unless the bottleneck is disk I/O or the network, you could easily at least triple performance.
 
LVL 47

Expert Comment

by:dlethe
ID: 39241239
Actually this will be more efficient, since the work for each CRC was previously done in a DLL, was it not?  That DLL's binary code would not be optimized for the hardware the user is running; it must be written for the lowest common denominator in hardware for the sake of portability.

So the C code IS the fastest and most efficient way to resolve the issue ... IF that code is compiled with optimization and, ideally, profiled.
 

Author Closing Comment

by:hpnix4ever
ID: 39252567
The C++ solution is suboptimal in this case, since the rest of the application is PowerShell code and should not be translated. For a large number of files, a separate program would have to be called for each file to do the CRC check, and that would eat up the performance benefit completely.
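For what it's worth, the per-file process-startup cost described above can be amortized by having the external tool checksum a whole batch of files in one invocation, with the scripting side passing the file list in once. A hypothetical sketch (Python standing in for the external tool; all names are mine, not from the thread):

```python
import zlib

def crc32_file(path, chunk_size=64 * 1024):
    """CRC-32 of one file, computed chunk by chunk."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

def crc_report(paths):
    """One process handles many files: return {path: crc32} so the
    caller pays process-startup cost once, not once per file."""
    return {p: crc32_file(p) for p in paths}
```

The calling script (PowerShell, in the author's setup) would then pipe or pass the full file list to a single invocation and parse one combined report, instead of spawning one process per file.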

Regards

Hans
