
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 610

CRC check with path depth > 256 characters

We have created a file-copy tool based on Robocopy. After the copy job we need a CRC check to compare the source and destination files.
Robocopy does the copy job without problems, even with path depths well beyond 256 characters.
The following CRC check (self-developed in PowerShell, alternatively in VBScript) cannot read the files in the long paths. The cause seems to be the use of the Win32 API.

Is there a way to access files by bypassing the Win32 API, as Robocopy does internally?
Or is there a CRC check tool that can handle long paths?

The operating systems in use range from Windows Server 2003 to Windows Server 2008 R2.
Asked by: hpnix4ever
6 Solutions
 
DavidCommented:
The CRC32 algorithm is well understood, and working open source / public domain subroutines can be found online for any language you desire.

Write the code yourself by incorporating a public domain subroutine. Then you know that what you have will work.
 
CoralonCommented:
My understanding is that if you are accessing the paths by the "normal" path names, you will hit the MAX_PATH limitation of 260 characters.  If you use the alternate syntax, you should be able to use the entire path.

That alternate syntax is something like \\?\c:\xxxxxx
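A minimal sketch (an editorial illustration, not code from the thread) of producing that extended-length form for both drive and UNC paths:

```cpp
// Sketch: convert a normal Windows path into the extended-length "\\?\"
// form that bypasses the MAX_PATH (260 character) limit.
// Drive paths become \\?\C:\... ; UNC paths become \\?\UNC\server\share\...
#include <string>

std::wstring to_extended_path(const std::wstring& path) {
    if (path.rfind(L"\\\\?\\", 0) == 0)           // already extended-length
        return path;
    if (path.rfind(L"\\\\", 0) == 0)              // UNC: \\server\... -> \\?\UNC\server\...
        return L"\\\\?\\UNC" + path.substr(1);
    return L"\\\\?\\" + path;                     // plain drive path
}
```

Note that the \\?\ prefix is only honored by the Unicode (W) versions of the Win32 file functions, which is one reason scripts that go through ANSI or framework wrappers can still hit the limit.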

Coralon
 
hpnix4everAuthor Commented:
Thanks for the answer, but this does not describe a solution to the real problem. The main problem is that PowerShell and VBScript use the Win32 API and inherit its MAX_PATH limitation. The CRC check code itself works fine, but we can't read files in a deep path structure. We are looking for a way to use the ntdll.dll interface to read/write files; ntdll.dll does not have the MAX_PATH limitation.

Using UNC names doesn't solve the problem. A workaround, though not really practicable, is to split the long path into several parts and map them to drives.
I think the only clean solution is to use ntdll.dll natively for file operations.
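For what it's worth, the mapping workaround Hans dismisses could be sketched like this: split the long path at backslashes into segments short enough that each could in turn be mapped to a drive letter (e.g. with subst). The function name and the 250-character limit are assumptions for illustration:

```cpp
// Illustrative sketch of the "split and map to drives" workaround:
// break a long path at directory separators into segments, each short
// enough to stay under the MAX_PATH limit once mapped.
#include <string>
#include <vector>

std::vector<std::wstring> split_for_mapping(const std::wstring& path,
                                            size_t maxLen = 250) {
    std::vector<std::wstring> parts;
    size_t start = 0;
    while (path.size() - start > maxLen) {
        // cut at the last separator that keeps this segment under maxLen
        size_t cut = path.rfind(L'\\', start + maxLen);
        if (cut == std::wstring::npos || cut <= start)
            break;                                // a single component too long: give up
        parts.push_back(path.substr(start, cut - start));
        start = cut + 1;                          // skip the separator itself
    }
    parts.push_back(path.substr(start));
    return parts;
}
```

Each returned segment would then have to be mapped in turn, which is exactly the administrative overhead that makes the workaround impractical for a bulk CRC job.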

Hans
 
CoralonCommented:
I'm not a programmer. The UNC syntax worked for me at the command prompt to physically delete files and folders in extremely deep paths, which is why I suggested it.

Good luck!

Coralon
 
hpnix4everAuthor Commented:
We are using UNC paths; the problem is not based on this.

We have now ported the code from PowerShell to good old C++ in order to use the ntdll.dll API natively. There is no problem with the path depth.

Thanks to everyone for the answers; the problem is now solved.

Regards

Hans
 
DavidCommented:
It probably runs faster than before, too, especially if you profile the code and run it through the optimizer. The DLL code is never optimized for your hardware; it is built for portability, not efficiency.

I wrote a similar application years ago and used Intel's optimizing compiler with profiling. (Profiling means running a special instrumented build of the code on real data and letting it create log files as it runs, to figure out which chunks of code are executed the most, improve branch prediction, and so on. You then recompile with this log data as part of the input, run again, repeat, and then build the final version.)

To make a long story short, the optimized and re-optimized code ran about 10x faster.
 
DavidCommented:
This IS the type of application that would benefit a great deal from profiling-based optimization, especially if you are using a multi-core system and a high-end 64-bit processor.

If speed IS an issue, please consider benchmarking against the original version (with path names under 256 characters so it runs), then optimizing and reporting the results.

Unless the bottleneck is disk or network I/O, you could easily at least triple performance.
 
DavidCommented:
Actually, this will be more efficient, since the work for each CRC was previously done in a DLL, was it not? That DLL binary code would not be optimized for the hardware the user is running; it must be written for the lowest common denominator for portability.

So the C++ code IS the fastest and most efficient way to resolve the issue ... IF that code is compiled with optimization and, ideally, profiled.
 
hpnix4everAuthor Commented:
The C++ solution is suboptimal in this case, because the rest of the application is PowerShell code and should not be translated. For a large number of files, a separate program would have to be called for each file to do the CRC check, and the per-process startup overhead would eat up the benefit completely.

Regards

Hans