Reclaiming space in public folder database

npdodge asked:
Our public folder database is growing out of control, but I'm not sure what is taking up all of the space. Online defragmentation runs every night but only completes a full pass about every 8 days. Online maintenance does not overlap with the backup schedule, but it is not able to finish in one night due to the size of the public folder database, which is currently at 203GB. I ran a PowerShell script today to export the TotalItemSize and TotalDeletedItemSize of each public folder in the database, and that total came to around 64GB. The last time maintenance completed, a couple of days ago, event 1221 reported only 2732MB of free space. Where is the remaining 139GB? Any suggestions?
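
(For reference, the arithmetic behind the question as a quick sketch - the figures are the ones quoted above, and the event 1221 white space only explains a small part of the gap:)

$edbGB     = 203            # physical size of the public folder .edb file
$contentGB = 64             # TotalItemSize + TotalDeletedItemSize from the script
$whiteGB   = 2732 / 1024    # free space reported by event 1221 (about 2.7GB)
"{0:N0} GB unaccounted for" -f ($edbGB - $contentGB - $whiteGB)   # roughly 136GB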
 
Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
An offline defrag requires free disk space equal to roughly 110% of the consumed DB size.
Running /d with /p makes sense, as it will create the defragmented copy as a separate file and keep the original as it is :)
I assume around 10-12GB per hour, but it very much depends on your server performance as well.
So I would budget for the full 200GB and about 20 hours, as that's the .edb size and we need to plan for extra time and not assume anything.
eseutil /ms - this will give us more of an idea .... I would also have set the retention limit to 0 on the PF database to ensure I get as much white space as possible.
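
(A minimal sketch of the two eseutil operations being discussed - the paths are placeholders, the database must be dismounted first, and eseutil is assumed to be on the path or run from the Exchange bin directory:)

# Space dump: reports the free (white space) pages inside the .edb
eseutil /ms D:\ExchangeDB\PublicFolders.edb
# Offline defrag: /t names the temporary (defragmented) copy and /p preserves it
# as a separate file instead of swapping it in over the original
eseutil /d D:\ExchangeDB\PublicFolders.edb /tE:\Defrag\PFTemp.edb /p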

- Rancy

Exchange_Geek commented:
Let online maintenance run 24x7 and wait for event ID 1221 once it completes.

You're on the right track by looking at TotalItemSize and TotalDeletedItemSize. This is precisely what I'll ask you to run:

Get-PublicFolderStatistics -Server "servername" | Select-Object AdminDisplayName, CreationTime, LastModificationTime, LastUserAccessTime, ItemCount, TotalItemSize, ServerName, DatabaseName | Export-Csv C:\publicfolderstats.csv

If the data comes to something close to 175GB, that's good - if not, I'll repeat: let online maintenance run 24x7 again.

Dismount and remount the PF database once if need be.

Regards,
Exchange_Geek

npdodge (Author) commented:
Will I even see event 1221 if online maintenance runs 24x7? Where did you come up with 175GB? What is the point of gathering item count, creation time, access time, etc.? From what I've been reading, I should dismount the public folder DB and run eseutil /ms to perform a space dump?
 
Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
"I ran a powershell script today to export the totalitemsize and totaldeleteditemsize of each public folder in the database and that total came to around 64GB" - Look, the deleted data only becomes free space in the database, and you will only see it reported by online maintenance (event 1221), after it has crossed the retention period limit set on the PF database :)

So either you can push the value to 0, let online maintenance run for a day, and check event 1221, or else wait a couple of weeks until all the data falls past the limit and gives you what you want :(
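
(A minimal sketch of what "push the value to 0" could look like, assuming the deleted item retention setting is the limit in question - the database identity is a placeholder:)

# Drop deleted item retention to 0 so the next online maintenance pass can
# reclaim the dumpster contents as white space, then restore it afterwards
Set-PublicFolderDatabase -Identity "MYSERVER\Storage Group\Public Folder Database" -DeletedItemRetention "0.00:00:00"
# Set-PublicFolderDatabase -Identity "MYSERVER\Storage Group\Public Folder Database" -DeletedItemRetention "14.00:00:00"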

- Rancy

npdodge (Author) commented:
Rancy and Exchange_Geek,

I should probably have mentioned that the TotalItemSize came to 64,982MB and the TotalDeletedItemSize was only 153MB, so I don't think online maintenance is going to get me anywhere.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
So you're saying "TotalItemSize came to 64,982MB" while the PF database is 203GB as per your initial info ... where is the other 140GB?

Hope you're getting a complete list of all PF sizes.

- Rancy

npdodge (Author) commented:
Exactly. When I ran the PowerShell script I used @{ expression={$_.totalitemsize.value.toMB()}} so it would output the value in MB. I also opened the folder size properties in Outlook for each top-level folder in the public folder database and got a similar total.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
So how come the PF database is 203GB while the output from the shell shows 65GB ..... hmm, a bit confused.

- Rancy

npdodge (Author) commented:
That's why I'm posting this question. :) I'm confused too. Maybe a space dump will give me more insight into what is taking up all the space. I'm hoping to run that tonight.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:

npdodge (Author) commented:
This article applies to Exchange 2010 SP2.  I am running Exchange 2007 SP3.  I should have stated that in my original question, sorry.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:

npdodge (Author) commented:
Can you assist me with that script in exporting the data to a text file? I've tried a few options and I'm not getting the output that I would expect. Also, I don't see how this script is any different from what I was running:

get-publicfolderstatistics | ft name, @{ expression={$_.totalitemsize.value.tokb()}}, @{ expression={$_.totaldeleteditemsize.value.tokb()}} > c:\<output file>

I ran my PowerShell command again, but this time in KB to get a more accurate size, since MB will be rounded up or down. It did change the value of TotalItemSize to 89,338,452KB, or roughly 85GB. TotalDeletedItemSize was 202,438KB, or roughly 198MB. These are closer to the numbers I was getting in Outlook, because those are displayed in KB; I had just ballparked the MB figures when documenting the numbers from Outlook without actually doing the math. I'm sure if I ran this again in bytes my TotalItemSize would be even higher, but we probably still have over 100GB unaccounted for.
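
(For the export question, a sketch along these lines should give a clean CSV - the output path is just an example and the calculated-property names are arbitrary:)

Get-PublicFolderStatistics |
    Select-Object Name,
        @{Name="TotalItemSizeKB"; Expression={$_.TotalItemSize.Value.ToKB()}},
        @{Name="TotalDeletedItemSizeKB"; Expression={$_.TotalDeletedItemSize.Value.ToKB()}} |
    Export-Csv C:\pf-sizes.csv -NoTypeInformation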

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
Get-PublicFolderStatistics | ft name, totalitemsize, totaldeleteditemsize > C:\Output.csv

Once you get the data, try to sum both columns and check what the total is.
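
(A sketch of summing the two columns straight from the shell, assuming the size values expose ToBytes() the same way they expose the ToKB() and ToMB() used earlier:)

$stats        = Get-PublicFolderStatistics
$itemBytes    = ($stats | ForEach-Object { $_.TotalItemSize.Value.ToBytes() } | Measure-Object -Sum).Sum
$deletedBytes = ($stats | ForEach-Object { $_.TotalDeletedItemSize.Value.ToBytes() } | Measure-Object -Sum).Sum
"Items:   {0:N1} GB" -f ($itemBytes / 1GB)
"Deleted: {0:N1} MB" -f ($deletedBytes / 1MB)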

- Rancy

npdodge (Author) commented:
Yeah, I actually started running that as soon as I sent my last post. When you output the sizes in bytes, the TotalItemSize is 91,929,172,081 bytes, or 87,670.5MB. The TotalDeletedItemSize is 216,856,103 bytes, or 206.8MB.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
That's only around 85GB .... where is the rest going?
Is it possible for you to try an offline defrag ... hope you know how much time and space is required?

- Rancy

npdodge (Author) commented:
I won't be able to do that until the weekend. I believe you need 1.5 times the size of the database in free space, and I have plenty of free space on the same volume. I would also use the /p option to preserve the original DB, just in case. I also recall that it takes roughly an hour per 8GB. So if we do end up with an 85GB DB then I'm only looking at 10-11 hours; worst case scenario, if it doesn't decrease, I'm looking at around 26 hours. My users won't be too happy, but at least it's just the public folder database, and only a few would actually have a need for it over the weekend. I'm planning to dismount the DB tonight and run eseutil /ms to get a space dump.

npdodge (Author) commented:
Do you have an idea how long it would take to produce the space dump on this public folder database if I were to dismount it this evening? Are we talking 5-10 minutes? What limit are you referring to - the retention limit? Right now we are keeping deleted items in the PF for 14 days and, as you can see, they aren't taking up much space.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
The PF database is 203GB as you said, and the /ms command runs at a similar sort of speed, so it will surely take a while, since it has to walk the entire tree structure of the database.

The retention limit is the setting on the database that says how long deleted data remains in the dumpster before it becomes available as white space - this is the 14 days you're referring to :)

- Rancy

npdodge (Author) commented:
If you're saying that it could take 20 hours to run a space dump, then I might as well perform the offline defrag first and hope for the best.

"I would also have set the retention limit to 0"
I was just confirming whether you wanted me to change my retention limit to 0. Not sure if this is necessary, since 14 days' worth is not consuming much.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
I agree on running the offline defrag - even if the dump only shows 10GB of white space we are planning to run it anyway, so why waste that time?

If possible, I would say create a copy and run the offline defrag against that. If there are any issues - for example it takes too much time and we have to kill or cancel the process while it has the file locked, with possible chances of corruption - we will still have a copy :(

The reason I wanted to change the value to 0 and let one online maintenance pass run first is that, since we are running the offline defrag anyway, why not get the most out of it? We can't run it every month or so ... forget about weekly :(

- Rancy

npdodge (Author) commented:
"If possible, I would say create a copy and run the offline defrag against that. If there are any issues - for example it takes too much time and we have to kill or cancel the process while it has the file locked, with possible chances of corruption - we will still have a copy :("

That's a great idea. We're planning to dismount the PF DB, make a copy, mount the DB again, and run an offline defrag on the copy. That way there is no downtime, and we'll be able to tell if the offline defrag actually reclaims all that space. If it does, then I'll do it again next weekend and actually leave the PF DB dismounted to prevent users from adding emails to it.
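
(A sketch of that plan - identities and paths are placeholders, and the copy only needs the .edb because a cleanly dismounted database is already in a consistent state:)

# Dismount just long enough to take a file copy, then bring production back online
Dismount-Database -Identity "MYSERVER\Storage Group\Public Folder Database"
Copy-Item D:\ExchangeDB\PublicFolders.edb E:\DefragTest\PublicFolders.edb
Mount-Database -Identity "MYSERVER\Storage Group\Public Folder Database"
# Defrag the copy offline while production stays mounted
eseutil /d E:\DefragTest\PublicFolders.edb /tE:\DefragTest\PFTemp.edb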

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
"If it does, then I'll do it again next weekend and actually leave the PF DB dismounted to prevent users from adding emails to it" - makes absolute sense :)

Starting with a test phase, huh .... I like that :)

- Rancy

npdodge (Author) commented:
I ran the offline defrag on a copy; it took less than 6 hours to complete, but it only cleared up around 24GB of space. The DB went from 204GB to 180GB. I then ran eseutil /ms against the new database that was created by the offline defrag. It looks like there are a lot of search folders within the public folder database, and it seems that I can easily clean these up with a registry entry according to this article:
http://blogs.technet.com/b/dblanch/archive/2009/04/24/tracking-down-exchange-2007-database-bloat.aspx
Not sure how much this would help; the article mentions that I can calculate the white space by taking the number of free pages at the end of the dump and multiplying by 8 (the page size in KB). In my case it looks like that would only free up around 8GB: (1105602 x 8) / 1024 = 8638MB.
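
(The same arithmetic in the shell, assuming the trailing figure in the dump is the count of free 8KB pages:)

$freePages = 1105602          # value at the end of the /ms space dump
"{0:N0} MB of white space" -f ($freePages * 8KB / 1MB)   # about 8,638MB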

I've attached the dump file for anyone to take a look and offer some additional advice.
msoutput.zip

npdodge (Author) commented:
Any suggestions? It doesn't look like you can have search folders in public folders, so I'm wondering what all of the tables that begin with S-1 are, and whether I should just set that registry key to delete these search-folder entries if that's what they really are.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
Yeah, there isn't much white space.

And I don't see what's consuming the space. As the PF database is quite huge, we can't even easily take a PST backup, recreate the PF database, and work from that.

- Rancy

npdodge (Author) commented:
I added the "reset views" registry entry for the public folder database over the weekend, and I now have 42,660MB of free space after online maintenance completed. That's nice, but I still have a lot of space unaccounted for. The search continues...
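
(Applying that entry might look roughly like the sketch below - the key path, server name, and database GUID are placeholders following the pattern described in the linked article, so verify against the article before using it:)

# Assumed layout: a "Reset Views" DWORD under the store's key for the public database instance
$key = 'HKLM:\SYSTEM\CurrentControlSet\Services\MSExchangeIS\MYSERVER\Public-00000000-0000-0000-0000-000000000000'
New-ItemProperty -Path $key -Name 'Reset Views' -PropertyType DWord -Value 1 -Force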

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
Wow, approx 40GB I guess .... hmm, let me check whether some other system public folders are using the space.

- Rancy

npdodge (Author) commented:
Well, that didn't last long. It took only one business day for all of that space to be used again by search folders. Resetting the registry entry freed the space up again, but it was quickly consumed the next day. At least I can now account for 30GB of the space. I have about 10GB free right now.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
That's frustrating - why is the PF content only 65GB but the database 200+ GB?

I would have preferred to repair and then offline defrag the DB, but at this size the entire activity could take a couple of days or more :(

- Rancy

npdodge (Author) commented:
Rancy,

The Offline defrag I performed two weeks ago on the copied DB took less than 6 hours but only reduced the DB to 180GB.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
Yeah, I was speaking of a repair .... eseutil /p. It will fix any corruption, but it could take a long time and there's no guarantee it will fix the issue. It's not like a normal mailbox database where we can move the data and remove the old database, so the options are limited :(
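
(A hedged sketch of that repair path, again run against a copy - paths are placeholders, the database must be dismounted, and a hard repair discards unreadable pages, so it is very much a last resort:)

eseutil /p E:\DefragTest\PublicFolders.edb   # hard repair
eseutil /d E:\DefragTest\PublicFolders.edb   # defragment/rebuild the repaired file
# A repaired production database would normally also be checked with isinteg afterwards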

- Rancy

npdodge (Author) commented:
Well, I could always run a repair and defrag on a copy again to give me an idea of how long it will take and what the outcome will be.

Manpreet Singh Khatra (Solutions Architect, Project Lead) commented:
So make a copy and let's run it against that, and let production keep working in the meantime :)

Will await your response with details.

- Rancy

Exchange_Geek commented:
Geez, this post went on and on and isn't going anywhere - a repair *may* help, I'd agree. But if you are dealing with this much unexplained space, you are heading towards disaster with such a big database.

My best bet - be ready to export the data to PSTs, the bulk of it by top-level folder. This will take your entire weekend - maybe more.

But once you export the data and then re-import it, you will have a fresh database to work with, and you'll have a PST backup as well.

That's your last bet.

Regards,
Exchange_Geek