Solved

Best practices for uploading large files

Posted on 2014-11-29
440 Views
Last Modified: 2014-12-05
Hi all,

I'm looking for best practices for uploading large files (on the order of gigabytes).
The application's user base is on the order of thousands, and I don't want to block the servers if 100 users each need to upload a huge video file.

I have some options already on my plate, from Akamai and Azure Blob Storage to simple hosted solutions with Nginx or even IIS.

There's no need to act upon the upload, and the users are distributed all over the world, so a cloud solution seems the most logical. The main requirement is to get the files from the users as quickly and reliably as possible.

Do you have any experience to share? Pitfalls? Edge cases to be aware of? Successful architectures?

Thanks!
Question by:Alexandre Simões
5 Comments
 
LVL 38

Assisted Solution

by:Rich Rumble
Rich Rumble earned 125 total points
ID: 40472048
You'll want to dedup like Box, Dropbox and others do: hash the file before the upload to see if it already exists, then either make a copy on the server or use a pointer to the same file. The names can be different; they have no bearing on the hash or the file size. The other part you'd want is compression, applied before (or during) transmission. There are a variety of ways to do that, and you can probably find some JavaScript and utilities that help speed up the transmission. For downloads, you can use gzip transfer encoding to compress the files before you send them to the users, and the browser decompresses them for the user.
-rich
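A minimal sketch of that pre-upload hashing idea, in Python (the dedup endpoint and its URL are hypothetical; the point is only that the client computes a digest locally and asks the server whether the content already exists before pushing gigabytes over the wire):

import hashlib


def sha256_of_file(path: str, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Hash the file in chunks so multi-GB files never have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Hypothetical dedup check: ask the server whether it already has this content,
# and only upload the bytes when the answer is "no". The user-visible file name
# has no bearing on the hash or the size, so renamed copies still dedup.
# import requests
# resp = requests.get("https://example.com/api/files/exists",
#                     params={"sha256": sha256_of_file("video.mp4")})
# if not resp.json()["exists"]:
#     upload("video.mp4")  # upload() is whatever transfer path you choose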
 
LVL 61

Accepted Solution

by:btan
btan earned 250 total points
ID: 40472067
Cloud file storage, CDN, storage tiering, on-demand secure access and rate throttling are the areas I would focus on for such a use case. Uploading large files, especially video, can still be time consuming, so below are some suggestions.

1. In an Azure context, differentiate between Block and Page Blobs. Choose Block Blobs mostly for streaming content: they are consumed in blocks, which is ideal for rendering in streaming solutions (see the upload sketch at the end of this comment). Choose Page Blobs mostly for frequent-write workloads such as VM image disks: a Page Blob lets you write to a specific range, so you do not have to rewrite the whole blob, which would be time consuming.

2. Consider creating and maintaining snapshots, especially if the content does not change much after upload. That does not mean it is static per se, but writes are less frequent. Snapshotting can also improve availability: a snapshot can be served to all users performing read-only operations, leaving the original blob for writes.

3. A CDN, as mentioned, is good for fast access because it serves cached content from the node nearest to the requesting user. This global reach is more attractive than an on-premise scheme using a WAN accelerator; the latter still helps, but a CDN effectively includes that benefit, and you can combine both for added value. Overall, a CDN reduces latency and increases availability by placing duplicates of the content close to users. The CDN subscription adds cost, but (if using Azure) you are not charged storage transaction costs for each user request to the blob, since the client hits the CDN node rather than the storage account.

4. Consider a resumable data transfer feature: large parallel uploads take time, and network latency plus the limited bandwidth most organisations have can interrupt them. Resumable transfers let the user continue an upload after a communication failure has interrupted the flow of data.

5. Secure access is also critical, so that not everyone can access everything with full permissions. Getting this wrong complicates the storage strategy, and not adhering to the least-privilege principle is bad security practice; there have been incidents of cloud storage being breached and leaking data, which you must avoid. Enforce granular access, such as an account (owner) access key known only to authorised users and shared access keys for authorised teams, with permissions set at the blob, container or storage-account level depending on what the provider offers. It is also worth considering multi-factor authentication to make sure the true user is connecting, since usernames and passwords tend to be weak, especially when users choose simple passwords for convenience. Password complexity policies help, but at a minimum protect the privileged admin accounts used for remote administration of the cloud storage.

This article is also a good read on best practices for designing large-scale services (including tiering the storage). It sums up the approach to scale as: partition the load and compose it across multiple scale units, be that multiple VMs, databases, storage accounts, cloud services, or data centers.
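To illustrate points 1 and 4, here is a minimal sketch of a chunked Block Blob upload with the azure-storage-blob Python SDK (the SDK version is an assumption on my part; the same Put Block / Put Block List pattern applies at the REST level). Each block is staged independently and nothing becomes visible until the block list is committed, so an interrupted transfer only needs to re-send the blocks it has not staged yet:

import base64

from azure.storage.blob import BlobBlock, BlobClient


def upload_in_blocks(blob_client: BlobClient, path: str,
                     block_size: int = 4 * 1024 * 1024) -> None:
    """Stage a large file as 4 MB blocks, then commit them as one Block Blob."""
    block_ids = []
    with open(path, "rb") as f:
        index = 0
        while True:
            chunk = f.read(block_size)
            if not chunk:
                break
            # Block IDs must be base64 strings of equal length within a blob.
            block_id = base64.b64encode(f"{index:08d}".encode()).decode()
            blob_client.stage_block(block_id=block_id, data=chunk)
            block_ids.append(BlobBlock(block_id=block_id))
            index += 1
    # Committing makes the blob visible; a retry after a failure can re-stage
    # only the missing blocks and commit again.
    blob_client.commit_block_list(block_ids)


# Hypothetical usage with a shared-access-signature URL handed to the client
# (ties in with point 5 on granular, time-limited access):
# client = BlobClient.from_blob_url("https://<account>.blob.core.windows.net/uploads/video.mp4?<sas>")
# upload_in_blocks(client, "video.mp4")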
 
LVL 78

Assisted Solution

by:David Johnson, CD, MVP
David Johnson, CD, MVP earned 125 total points
ID: 40472130
Then upload to a cloud provider: they have the bandwidth and the disk I/O speed not to block your cloud-based server. Video files are already compressed (unless users are sending raw AVIs), so gzip will probably add complexity for little or no reward; there may even be a penalty if the files are not compressible. Akamai is a huge cache, but I've received broken files from it in the past, usually when I've updated the RSS feed before the full content was cached by Akamai or Cachefly. What these two do is reduce your bandwidth by using a form of multicasting, and the cost savings can really add up when many users are trying to access the same large files at the same time. Geo-redundancy takes a bit of time (minutes, not hours), but it is still something to take into consideration. Asia and Australia are the primary problem areas, which they handle extremely well; Australia is a particular problem since it has limited pipes (getting better over time).
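A quick illustration of that compressibility point, in Python: gzip barely shrinks data that is already compressed (random bytes stand in for encoded video here) but collapses highly redundant data, so compressing uploads only pays off for the latter.

import gzip
import os

already_compressed = os.urandom(1_000_000)   # stands in for H.264 video, JPEGs, zips...
highly_redundant = b"A" * 1_000_000          # stands in for logs, text, raw bitmaps...

print(len(gzip.compress(already_compressed)))  # roughly 1,000,000 bytes, sometimes slightly more
print(len(gzip.compress(highly_redundant)))    # around a kilobyte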

Data deduplication is only a factor when there is similar data and you want to save on storage costs. If thousands of users have Madonna's Ray of Light album, considerable savings are achievable. This is where the scale of the user base comes into play: the more users you have, the more of them might put duplicate content into a centralised store. End-to-end encryption and encryption at rest, where the user holds the only decryption key, keep the data encrypted at every stage but also make it essentially incompressible, and providers lose this economy of scale because each file (which now is really just a data stream) is unique, as each is encrypted differently.

If instead you were referring to a local area network, which has limited bandwidth and disk IOPS compared to a cloud provider (where 1,000 users is just a drop in the ocean), then 100 users uploading or downloading a huge file will fill your pipe and degrade service for others on the same LAN. In that case you have to do some traffic shaping to maintain quality of service for everyone.
 
LVL 61

Assisted Solution

by:btan
btan earned 250 total points
ID: 40472202
Adding on...

6. Drilling a bit further into database storage: it is generally not healthy to store the file blob itself in a database table, compared to storing just a path or link to where the actual file can be retrieved; SQL Server, for example, has a FILESTREAM column type that keeps the data in the file system rather than in the table. Furthermore, if what you store is a link or file path, it is more efficient to simply change that path when the file content changes; this is far less resource-intensive than replacing the actual entry, even partially, which is computationally heavy. Overall, a blob stored in a DB table can hinder database performance and will not improve file retrieval performance. If you must store it, put the large blob in a separate table and keep only a foreign-key reference in your main table, and avoid duplicating it (see the metadata sketch at the end of this comment).

7. If file conversion is needed, cloud storage may not be the best place to do it offline, nor should users have to convert the file themselves and upload it again. It is better to keep control and a consistent experience by converting files on user request in a secure and fast fashion. If interested, you can check out https://transloadit.com/tour, which processes and converts uploaded files according to your conversion instructions.
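A minimal sketch of point 6, using SQLite purely for illustration (the column names and URI are hypothetical): the table keeps only a pointer to the blob plus the metadata needed for dedup and tagging, never the file bytes themselves.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE media_files (
        id           INTEGER PRIMARY KEY,
        blob_uri     TEXT NOT NULL,         -- pointer to the file in blob storage / file share
        sha256       TEXT NOT NULL UNIQUE,  -- dedup key: identical content is stored once
        size_bytes   INTEGER NOT NULL,
        content_type TEXT,
        tags         TEXT                   -- e.g. comma-separated tags for the media file
    )
""")
conn.execute(
    "INSERT INTO media_files (blob_uri, sha256, size_bytes, content_type, tags) "
    "VALUES (?, ?, ?, ?, ?)",
    ("https://example.blob.core.windows.net/uploads/0001.mp4",
     "placeholder-sha256-digest", 1_073_741_824, "video/mp4", "holiday,beach"),
)
# Replacing the file later means writing a new blob and updating blob_uri,
# which is far cheaper than rewriting a multi-GB value inside the database.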
 
LVL 30

Author Closing Comment

by:Alexandre Simões
ID: 40482533
Hi guys, sorry for the delay.

Currently we're dealing with a lot of constraints, most of them more political than technical.
The solution we found will use chunked upload whenever the browser allows it, directly into the file share.
Streaming these files back to the client is a requirement we'll have to deal with later, as one of the requirements is to be able to tag the media files (pictures and videos).

For now, direct upload to the CDN is not possible because of security constraints... we'll have to revisit this later, especially for the streaming part.

Thank you very much for your inputs,
Cheers!