• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 515

What is the fastest lossless way to transfer files to a remote location?

    I'm trying to put together a flow chart for a doctor's office. Their problem is transferring images, which are 50MB-100MB in size, to their remote locations (3 locations) without any loss of quality (no lossy software compression). Right now their transfers are taking forever, and when patients request the images it's a very long wait.
1 Solution
Well, one thing is certain:
you cannot move data faster than your bandwidth allows.

You can of course use some compression.
If you use a typical lossy image format, which lets the recipient view the image right after download without a separate decompression step, you'll lose quality.
If you compress it in a way that requires decompression afterward, the recipients will need the same third-party compression tool.

If the file goes directly to their PC, that's not a good approach; but if the image goes through your business's computer first, you can decompress it there before sending it on.

You must make the choice between loss of quality and loss of flexibility.
To work around the flexibility issue, you can use a well-known lossless compressor such as gzip.

Tell me which loss you prefer and I'll tell you the best software for the job.
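The lossless option above can be sketched with Python's standard-library gzip module. This is a minimal illustration, not the expert's recommended tool; the function and file names are hypothetical:

```python
import gzip
import shutil

def compress_file(src, dst):
    """Losslessly compress src into a gzip archive at dst."""
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

def decompress_file(src, dst):
    """Restore the original bytes from a gzip archive, bit for bit."""
    with gzip.open(src, "rb") as f_in, open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
```

Because gzip is lossless, the decompressed file is byte-identical to the original, so image quality is untouched; the trade-off is that the remote side must run a decompression step.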
Computers4me (Author) commented:
I'd prefer loss of flexibility.
Dave Baldwin (Fixer of Problems) commented:
Here's a 'bandwidth calculator' for you: http://www.ibeast.com/content/tools/band-calc.asp  You can plug in the file size and see how long it will take at different rates. If you have consumer DSL, it can take from 10 to 55 minutes to transfer a 50MB file. On DSL, the limiting factor is usually the upload speed, which is typically much lower than the download speed.
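The arithmetic behind such a calculator is simple. One sketch of it, keeping the usual units straight (file sizes in megabytes, link speeds in megabits per second):

```python
def transfer_time_seconds(file_size_mb, link_speed_mbps):
    """Ideal transfer time for a file.

    file_size_mb   -- file size in megabytes (MB)
    link_speed_mbps -- link speed in megabits per second (Mbps)

    Real transfers are slower because of protocol overhead,
    but this gives the theoretical floor.
    """
    file_size_megabits = file_size_mb * 8  # 1 byte = 8 bits
    return file_size_megabits / link_speed_mbps

# A 50 MB image over a 2 Mbps link:
# 50 * 8 / 2 = 200 seconds, i.e. about 3.3 minutes at best.
```

The common trap is mixing up megabytes and megabits: ISPs quote links in bits per second, file sizes are in bytes, so the factor of 8 matters.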

Well, use KGB then.
It's known for its strong algorithm, which was able to compress an entire Windows installation DVD into less than 50MB.
But believe me, if you choose high compression rates, compressing and decompressing will take considerable time.


Computers4me (Author) commented:
Will that time take longer than the time it takes to transfer the data without compression?

Also, the upload and download speeds at the location where the problem is most drastic are 2 Mbps up and 2 Mbps down.

    I was thinking about setting up an RDP session so the doctor can show his patient the images without having to transfer them first. The only problem then is that if the patient wants a printout, the image(s) would still have to be transferred.
In the short run you may have no choice but to use the compression tool suggested by sergiobg57.

However, you should really discuss the underlying issue with the physicians, which is investing in a long-term solution, especially if file transfers of this size are the norm. Plus, with HIPAA compliance, you may be at risk if you DON'T have proper file transfer security protection.


1) Higher "symmetric" bandwidth (equal upload and download) is needed at all three offices.

2) Good security appliances at each office to establish site-to-site VPNs would let you construct a secure tunnel for site-to-site file transfers. (You may already have this, but just lack the bandwidth horsepower to reduce wait time.)

3) If you are transferring medical files over the open internet between offices, I highly suggest a Secure Managed File Transfer system with non-repudiation capability that meets or exceeds HIPAA compliance standards.

I can suggest a great managed file transfer system: 128-bit or higher encryption for uploads and downloads, 256-bit encryption for files at rest on the server, and it's FIPS 140-2 compliant. It's called MoveIT DMZ by IPSwitch. http://www.ipswitchft.com/products/moveitdmz/

It's not cheap.

Anyway food for thought.
You may be able to replicate all of the data files to all of the offices. DFS-R, available in Windows Server 2003 R2 and later, does this quite well. I assume it's acceptable that a file created in one location may take a while to become locally available at another. What isn't known is whether, if you replicated everything to every location, the amount of new data generated on a daily basis would swamp the total bandwidth available over those same 24 hours. It also isn't very efficient to replicate everything everywhere if each file has only a 1% chance of being needed at a remote location. But operationally it is very simple if you can do it, because there is no need to pre-determine which data gets stored locally in which office.
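The "does daily data swamp the link" check above is just a back-of-the-envelope capacity calculation. A sketch, using the asker's 2 Mbps figure (the utilization factor is a hypothetical knob, since replication won't get the whole link to itself):

```python
def daily_capacity_gb(link_speed_mbps, utilization=1.0):
    """Theoretical gigabytes a link can move in 24 hours.

    link_speed_mbps -- link speed in megabits per second
    utilization     -- fraction of the link replication can actually use
    """
    seconds_per_day = 24 * 60 * 60  # 86,400
    megabits_per_day = link_speed_mbps * utilization * seconds_per_day
    return megabits_per_day / 8 / 1000  # megabits -> megabytes -> gigabytes

# A 2 Mbps link moves at most 2 * 86400 / 8 / 1000 = 21.6 GB per day.
```

If the offices generate more new imaging data per day than that figure (scaled by realistic utilization), whole-site replication over this link can never catch up.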

My understanding is that medical imaging data essentially doesn't compress and doesn't deduplicate. The only way to shrink it further is a lossy compression routine, similar to what JPEG and MP3 do, at the cost of some quality.
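The "doesn't compress" claim is easy to demonstrate: lossless compressors like gzip gain almost nothing on high-entropy bytes (a stand-in here for already-compressed image data), while redundant data shrinks dramatically. A quick sketch:

```python
import gzip
import os

# os.urandom produces high-entropy bytes, similar in character to
# already-compressed imaging data; the repeated string is its opposite.
random_like = os.urandom(100_000)
repetitive = b"tissue" * 20_000  # 120,000 highly redundant bytes

random_ratio = len(gzip.compress(random_like)) / len(random_like)
repetitive_ratio = len(gzip.compress(repetitive)) / len(repetitive)

print(f"high-entropy data: {random_ratio:.2f}")   # close to 1.0 (no gain)
print(f"redundant data:    {repetitive_ratio:.4f}")  # far below 1.0
```

This is why the only remaining lever on such data is lossy compression, which trades away quality rather than redundancy.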
Question has a verified solution.