What is the best practice to partition data from a very large table?

Posted on 2005-04-25
Last Modified: 2010-03-19

We have a very large table with millions of rows. We would like to partition the data based on the calendar year. What is the best practice and the most efficient way to partition (archive) that old data into the new partition? Is creating a new database the way to go? If not, could someone shed some light on this for us?

Question by:mei_liu
    LVL 1

    Expert Comment


We have a similar situation: we log all hits to a database and it grows very large, so on a monthly basis we export the data to a delimited file, then zip and store it to conserve space. Since we export the traffic data to WebTrends, the only purpose of keeping the old data is rebuilding reports. Similarly, if you are storing the old data purely for archival purposes and do not need it for operations, I would recommend storing it in a zipped text file.

If you need the old data for operations, then either creating a new database or creating a separate table in the same database would be fine. My preference would be a new database, for the following reasons:
1. A new database will allow you to back up and archive it separately.
2. The current database will stay small and be easier to manage.
3. If you are using the full recovery model, this will also keep the transaction log in check.
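As a rough sketch of the move itself (the database, table, and column names here are hypothetical, not from the original posts), archiving one calendar year into a separate database could look like an INSERT...SELECT followed by a delete:

```sql
-- Hypothetical names: ArchiveDB, CurrentDB, dbo.Orders, OrderDate.
-- Copy one calendar year of old rows into the archive database.
INSERT INTO ArchiveDB.dbo.Orders
SELECT *
FROM   CurrentDB.dbo.Orders
WHERE  OrderDate >= '2004-01-01'
  AND  OrderDate <  '2005-01-01';

-- Then remove those rows from the current database.
DELETE FROM CurrentDB.dbo.Orders
WHERE  OrderDate >= '2004-01-01'
  AND  OrderDate <  '2005-01-01';
```

Using a half-open date range (>= Jan 1, < Jan 1 of the next year) avoids missing rows with a time component on the last day of the year.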

    Hope this helps...


    Author Comment

    Hi Kailash,

Thanks for your help. Have you used any methods other than exporting the data as a text file and storing it elsewhere?

1. Have you yourself exported the old data to a new database?
2. What was your experience like, if any?
3. What would cause the newly created database with the imported old data to grow so large?
4. How could we control the size of a database with imported old data?

As we need the old data in production for query purposes, we can't take it offline and store it elsewhere. This is the main reason we are so concerned about its size.

    LVL 1

    Accepted Solution

    Hi Mei Liu,

Yes, we do have some old data that we need for operations, and in that case we export the old data to another database using a DTS package, and it works fine. The only problem you can run into is that the database you are exporting from will be locked if you do any sorting during the export, or a delete after the export. The duration of the lock will depend on the size of the data.
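One common way to keep those lock times short (a sketch with hypothetical table and column names; SET ROWCOUNT works on SQL Server 2000 and 2005) is to run the post-export delete in small batches, so each transaction holds its locks only briefly:

```sql
-- Delete already-exported rows in batches of 10,000 so each
-- transaction, and the locks it takes, stays short.
SET ROWCOUNT 10000;

DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    DELETE FROM dbo.Orders
    WHERE  OrderDate < '2005-01-01';

    SET @rows = @@ROWCOUNT;
END

SET ROWCOUNT 0;  -- restore the default (no row limit)
```

Between batches other sessions get a chance to acquire locks, so the table stays usable while the old rows drain out.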

I prefer a new database for the old data because I can then set its recovery model to Simple, which keeps the transaction log small. A Simple recovery model is fine for the old data because it does not change. That way I can keep the current database on the Full recovery model.
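Setting the two recovery models described above would look like this (the database names are hypothetical):

```sql
-- Archive database: minimal logging keeps the log file small.
ALTER DATABASE ArchiveDB SET RECOVERY SIMPLE;

-- Current database: full logging for point-in-time recovery.
ALTER DATABASE CurrentDB SET RECOVERY FULL;
```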


    Author Comment

    Thanks Kailash.
