Solved

Windows NTFS best file allocation unit size for a Backup-to-Disk Partition

Posted on 2010-11-19
2,977 Views
Last Modified: 2012-05-10
I'm configuring a 2TB partition for backup-to-disk, and normally I just take the default file allocation unit size of 4096.  Since I'll have a small number of files and the files themselves will be very large, would it help performance to set the size much higher than the default?  If so, should I set it to the maximum of 64K?
Question by:jpletcher1
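
To make the trade-off concrete, here is a rough back-of-the-envelope sketch in Python (an illustration, not from the thread; the file count and sizes are made-up placeholders for a backup-to-disk workload). It shows that with only a handful of very large files, even the worst-case slack from 64K clusters amounts to a few megabytes on a 2TB volume.

# Estimate the slack space wasted by a given NTFS allocation unit
# (cluster) size. Worst case: each file wastes part of its final cluster.
def slack_bytes(file_sizes, cluster_size):
    total = 0
    for size in file_sizes:
        remainder = size % cluster_size
        if remainder:
            total += cluster_size - remainder
    return total

# Hypothetical workload: ~40 backup files of roughly 50 GB each.
files = [50 * 1024**3 + i * 123457 for i in range(40)]

for cluster in (4096, 64 * 1024):
    waste = slack_bytes(files, cluster)
    print(f"{cluster // 1024}K clusters waste at most {waste / 1024**2:.2f} MiB")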
6 Comments
 
LVL 42

Accepted Solution

by:
kevinhsieh earned 63 total points
ID: 34175235
If you have a small number of large files, I would use 64K NTFS allocation blocks.
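
A minimal way to verify the setting after formatting, assuming a Windows host and a hypothetical E: backup volume (e.g. formatted with format E: /FS:NTFS /A:64K): this Python sketch parses the "Bytes Per Cluster" line from the output of the built-in fsutil tool, which typically requires an elevated prompt.

# Read a volume's allocation unit size from `fsutil fsinfo ntfsinfo`.
import subprocess

def bytes_per_cluster(drive="E:"):  # drive letter is an assumption
    out = subprocess.run(
        ["fsutil", "fsinfo", "ntfsinfo", drive],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        if "Bytes Per Cluster" in line:
            return int(line.split(":")[1].strip())
    raise RuntimeError("Bytes Per Cluster not found in fsutil output")

print(bytes_per_cluster("E:"))  # expect 65536 after formatting with /A:64K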
 
LVL 59

Assisted Solution

by:Darius Ghassem
Darius Ghassem earned 62 total points
ID: 34175239
For backup you can run the 64K option, but how much performance gain are you really going to get?

http://sqlblogcasts.com/blogs/ssqanet/archive/2008/04/28/sql-server-2005-and-disk-drive-allocation-unit-size-to-64k-any-benefit-or-performance.aspx
 

Author Comment

by:jpletcher1
ID: 34175796
That link requires a login, so I can't see it.  I'm not sure of the performance statistics; I only know that everything written by Microsoft claims the larger the size, the better the performance.  Can you explain a little about what the article says?
 
LVL 42

Expert Comment

by:kevinhsieh
ID: 34175807
Just hit cancel when asked to log in.
 
LVL 59

Expert Comment

by:Darius Ghassem
ID: 34175812
Just hit cancel on the login.
 

Author Comment

by:jpletcher1
ID: 34175889
As I read the article, it seems to recommend a 64K block size as long as I'm not going to use compression, right?  I don't see anything specific regarding performance, so to speak.  I think I'll just go with 64K.  Thanks for the info.
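
For reference, the compression caveat reflects a hard NTFS limit: file compression is only supported when the cluster size is 4 KB or smaller, so a 64K-cluster volume cannot use it. A tiny Python sketch of that rule:

# NTFS supports file compression only at cluster sizes up to 4096 bytes.
MAX_COMPRESSIBLE_CLUSTER = 4096

def compression_supported(cluster_size):
    return cluster_size <= MAX_COMPRESSIBLE_CLUSTER

print(compression_supported(4096))       # True
print(compression_supported(64 * 1024))  # False: 64K rules compression out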