Solved

What is the fastest lossless way to transfer files to a remote location?

Posted on 2011-02-25
7
394 Views
Last Modified: 2012-06-27
Hi,
    I'm trying to put together a flow chart for a doctor's office. Their problem is transferring images, 50 MB to 100 MB in size, to their three remote locations without any loss of quality (no lossy software compression). Right now the transfers take forever, and when patients request the images it's a very long wait.
0
Comment
Question by:Computers4me
7 Comments
 
LVL 3

Expert Comment

by:sergiobg57
ID: 34985664
Well, one thing stands for a fact:
You cannot move it faster than your bandwidth allows you.

You can of course use some compression.
If you use typical image compression, which lets them view the image right after downloading it without decompressing anything, you'll lose quality.
If you compress it in a way they'll have to decompress afterward, then they'll need the third-party decompressor.

If the file goes directly to their PC, that's not a good approach; but if the image goes to your business's computer first, you can decompress it there before sending it on to them.

You must make the choice between loss of quality and loss of flexibility.
To work around the flexibility issue, you can use a well-known compressor like gzip.

Tell me which loss you prefer and I'll tell you the best software to do the trick.
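To illustrate what "lossless" means here, a minimal Python sketch using the standard-library gzip module (the sample payload is a placeholder, not a real medical image): compress, decompress, and verify that every byte survives the round trip.

```python
import gzip

# Placeholder data standing in for an image file -- not real DICOM.
original = b"DICOM-like payload " * 10_000

compressed = gzip.compress(original, compresslevel=9)
restored = gzip.decompress(compressed)

# Lossless: the restored bytes are identical to the original.
assert restored == original
print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")
```

The trade-off discussed above shows up directly: the recipient needs a gzip-aware tool to get the viewable file back.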
0
 

Author Comment

by:Computers4me
ID: 34985675
I'd prefer loss of flexibility.
0
 
LVL 83

Expert Comment

by:Dave Baldwin
ID: 34985697
Here's a bandwidth calculator for you: http://www.ibeast.com/content/tools/band-calc.asp  You can plug in the file size and see how long it will take at different rates.  If you have consumer DSL, a 50MB file can take from 10 to 55 minutes to transfer.  On DSL, the usual limit is the upload speed, which is usually much lower than the download speed.
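The arithmetic behind such a calculator is simple, and worth seeing because line speeds are quoted in bits per second while file sizes are in bytes. A rough Python sketch (the 80% efficiency factor is an assumption for protocol overhead, not a measured value):

```python
def transfer_minutes(size_mb: float, mbps: float, efficiency: float = 0.8) -> float:
    """Estimated transfer time: file size / usable bandwidth."""
    bits = size_mb * 8 * 1_000_000                    # MB -> bits (decimal units)
    seconds = bits / (mbps * 1_000_000 * efficiency)  # usable bits per second
    return seconds / 60

# 50 MB over a 0.384 Mbps DSL upload link at ~80% efficiency:
print(f"{transfer_minutes(50, 0.384):.0f} min")   # roughly 22 minutes
```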
0

 
LVL 3

Expert Comment

by:sergiobg57
ID: 34985719
Well, use KGB Archiver then.
It's known for its strong algorithm, which was able to compress a whole Windows installation DVD into less than 50MB.
But believe me, if you choose high compression ratios, it will take SOME time to compress and decompress.

http://downloads.sourceforge.net/project/kgbarchiver/kgbarchiver/v1.2.1.24/kgb_arch_win_gui_v1.2.1.24.exe?r=http%3A%2F%2Fwww.softpedia.com%2Fdyn-postdownload.php%3Fp%3D31575%26t%3D0%26i%3D2&ts=1298661291&use_mirror=ufpr


0
 

Author Comment

by:Computers4me
ID: 34985756
Will that time take longer than the time it takes to transfer the data without compression?

Also, the upload and download speeds at the location where the problem is most drastic are 2 Mbps up and 2 Mbps down.

    I was thinking about setting up an RDP session so the doctor can show his patients the images without having to transfer them first. The only problem then is that if a patient wants a printout, the image(s) would still have to be transferred.
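That question, "does compressing ever take longer than just sending the file?", can be estimated rather than guessed. A hedged Python sketch (gzip is used as a stand-in for whatever compressor you choose; timings vary by machine, so treat the result as a rough comparison only):

```python
import gzip
import time

def worth_compressing(data: bytes, link_mbps: float) -> bool:
    """Rough check: does compress-then-send beat sending the raw bytes?"""
    t0 = time.perf_counter()
    packed = gzip.compress(data, compresslevel=6)
    t_cpu = time.perf_counter() - t0           # time spent compressing

    link_bytes_per_s = link_mbps * 1_000_000 / 8
    t_raw = len(data) / link_bytes_per_s       # send uncompressed
    t_packed = len(packed) / link_bytes_per_s + t_cpu  # compress, then send
    return t_packed < t_raw

# Highly compressible data over a 2 Mbps link: compression wins easily.
print(worth_compressing(b"a" * 1_000_000, 2.0))
```

For data that barely compresses (as medical images often don't), the saved transfer time can shrink to nothing while the CPU cost remains, and the answer flips.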
0
 
LVL 2

Accepted Solution

by:
BITCooler earned 500 total points
ID: 34985794
In the short run you may have no choice but to use the compression tool suggested by sergiobg57.

However, you should really discuss the real issue with the physicians, which is investing in a long-term solution, especially if file transfers of this size are the norm.  Plus, with HIPAA compliance, you may be at risk if you DON'T have proper file transfer security protection.

Suggestions:

1) Higher "symmetric" bandwidth (equal upload and download) is needed at all three offices.

2) Good security appliances at each office to establish site-to-site VPNs would give you a secure tunnel for site-to-site file transfers.  (You may already have this, but just lack the bandwidth horsepower to reduce wait time.)

3) If you are transferring medical files over the open internet between offices, I highly suggest a secure managed file transfer system with non-repudiation capability that meets or exceeds HIPAA compliance standards.

I can suggest a great managed file transfer system: 128-bit or higher upload/download encryption, 256-bit encryption when files are at rest on the server, and it's FIPS 140-2 compliant.  It's called MOVEit DMZ by Ipswitch. http://www.ipswitchft.com/products/moveitdmz/

It's not cheap.

Anyway food for thought.
0
 
LVL 42

Expert Comment

by:kevinhsieh
ID: 34986231
You may be able to replicate all of the data files to all of the offices. DFS-R, available in Windows Server 2003 R2 and later, does this quite well. I assume it's okay for a file created in one location to take a while to become locally available at another. What isn't known is whether, if you replicated everything to every location, the amount of new data generated each day would swamp the total bandwidth available over those same 24 hours. It isn't very efficient to replicate everything everywhere if each file has only a 1% chance of being needed at a remote location. But operationally it's very simple if you can do it, because there is no need to pre-determine which data gets stored locally in which office.
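Whether daily replication fits in the pipe is one division. A quick Python sketch (the 70% efficiency factor is an assumption for protocol and scheduling overhead; measure your own link):

```python
def daily_capacity_gb(mbps: float, efficiency: float = 0.7) -> float:
    """GB of replication traffic a link can carry in 24 hours."""
    usable_mbps = mbps * efficiency
    megabits_per_day = usable_mbps * 86_400   # seconds in a day
    return megabits_per_day / 8 / 1_000       # megabits -> GB (decimal)

# The 2 Mbps link mentioned above:
print(f"{daily_capacity_gb(2.0):.1f} GB/day")   # about 15 GB/day
```

At 50-100 MB per image, that budget covers on the order of 150-300 new images per day per link; more than that and replication falls behind.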

My understanding is that medical imaging data essentially doesn't compress and doesn't deduplicate. The only way to shrink it is to employ a lossy compression routine, similar to what JPEG and MP3 do, at the cost of some quality.
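You can see why lossless compression gains nothing on noisy image data with a small Python experiment (random bytes stand in for sensor noise; real modality data will vary, but the shape of the result holds):

```python
import gzip
import os

noisy = os.urandom(1_000_000)           # high-entropy, image-noise-like bytes
structured = b"0123456789" * 100_000    # highly repetitive bytes

# Ratio near (or above) 1.0 means no gain; far below 1.0 means big gain.
print(f"noisy:      {len(gzip.compress(noisy)) / len(noisy):.3f}")
print(f"structured: {len(gzip.compress(structured)) / len(structured):.5f}")
```

The noisy payload comes out essentially the same size it went in, which is why the only remaining lever is lossy compression.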
0
