Solved

Table being Big

Posted on 2004-08-25
8
212 Views
Last Modified: 2010-05-18
My application collects data from 20 different applications. The extracts arrive as txt files and are loaded into a temp (working) database. There is no referential integrity there and the data is not indexed. The data is validated and cleaned, and rows that fail validation are held in suspense. Validated data is posted to a main table in the principal database (the disktrans table) and indexed. The batch is run against this table and data is posted into other tables, mostly updates and inserts. The processed rows are then flagged as processed, but the data continues to reside in this table. When the next batch occurs, new rows are appended to the table flagged as unprocessed, and all unprocessed rows are picked up for processing. The concern is that this table has 40 million rows. Another table, disttransdt, has 96 million rows. These two are the main tables against which all queries are run. Is there any other way to make this better? Any suggested solution?
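For reference, the cycle described above (append as unprocessed, process, flag as processed, leave in place) might be sketched roughly as follows; every table and column name except disktrans is an illustrative guess, not the actual schema:

```sql
-- Hypothetical sketch of the batch cycle; column names are assumed.

-- Validated rows from the working database are appended, flagged unprocessed
INSERT INTO disktrans (acct_no, trans_amt, processed)
SELECT acct_no, trans_amt, 0
FROM   working.dbo.validated_trans;

-- The batch picks up only the unprocessed rows
SELECT acct_no, trans_amt
FROM   disktrans
WHERE  processed = 0;

-- After posting, rows are flagged processed but never removed,
-- which is why the table keeps growing
UPDATE disktrans
SET    processed = 1
WHERE  processed = 0;
```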
Question by:jagishiv
8 Comments
 
LVL 11

Expert Comment

by:ram2098
ID: 11899577
I guess you can split your working data into two tables: one working table and one archive table.

You keep only the working data in the main table, and move the remaining (old) data to another table called "Archive". (I guess you may not be using the processed data any more, other than as a backup.)

This way you keep the working table size consistent and improve performance.
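A minimal sketch of that archive step, assuming a processed flag column (all names are illustrative):

```sql
-- Move already-processed rows out of the working table into an
-- archive table, inside one transaction so a failure leaves both intact
BEGIN TRANSACTION;

INSERT INTO disktrans_archive (acct_no, trans_amt, processed)
SELECT acct_no, trans_amt, processed
FROM   disktrans
WHERE  processed = 1;

DELETE FROM disktrans
WHERE  processed = 1;

COMMIT TRANSACTION;
```

On a table this size you would likely batch the DELETE (e.g. with SET ROWCOUNT on SQL Server 2000) to keep the transaction log manageable.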
 
LVL 11

Expert Comment

by:ram2098
ID: 11899584
Or you could even think of a separate "archive" database, where you archive the data on a regular basis.
 

Author Comment

by:jagishiv
ID: 11899632
The database is archived once a year because people access the info throughout the year; it is a banking commission application. The data size I am referring to is in the main table, not the working table.
 
LVL 1

Accepted Solution

by:
Parag_Gujarathi earned 250 total points
ID: 11899678
One way of overcoming this problem is to partition the big table into smaller tables. This will definitely speed up your access, but the downside is that you'll have to modify your application logic so that it fetches data from the proper table. You may also consider creating a partitioned view over the tables.

For further info on partitioning data in the table check the link below...

http://www.sqlteam.com/Item.ASP?ItemID=684

cheers
Parag.
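On SQL Server 2000, a local partitioned view is built from member tables that each carry a CHECK constraint on the partitioning column, unioned together. A rough sketch; the half-year split and all names here are assumptions, not the real schema:

```sql
-- Member tables, each constrained to its own range of the partitioning column
CREATE TABLE disktrans_h1 (
    trans_id    INT   NOT NULL,
    trans_month INT   NOT NULL CHECK (trans_month BETWEEN 1 AND 6),
    trans_amt   MONEY NOT NULL,
    CONSTRAINT pk_disktrans_h1 PRIMARY KEY (trans_id, trans_month)
);

CREATE TABLE disktrans_h2 (
    trans_id    INT   NOT NULL,
    trans_month INT   NOT NULL CHECK (trans_month BETWEEN 7 AND 12),
    trans_amt   MONEY NOT NULL,
    CONSTRAINT pk_disktrans_h2 PRIMARY KEY (trans_id, trans_month)
);
GO

-- The view presents the members as one table; with the CHECK constraints
-- in place the optimizer can skip member tables a query cannot match
CREATE VIEW disktrans_v AS
SELECT trans_id, trans_month, trans_amt FROM disktrans_h1
UNION ALL
SELECT trans_id, trans_month, trans_amt FROM disktrans_h2;
```

A query with a filter such as `WHERE trans_month = 2` would then only touch disktrans_h1.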
 
LVL 34

Assisted Solution

by:arbert
arbert earned 150 total points
ID: 11899700
Agree with Parag, this is a perfect application for partitioning....Your data still gets spread over multiple tables, but all the data is "online" and usable for people to report on...

96 million rows isn't very many for SQL Server--have you looked at SQL Profiler and the query plans to see if there are any improvements that can be made to the existing indexing/structure?
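One cheap thing to verify before partitioning (column names here are assumptions): that the batch's "unprocessed" scan isn't reading all 40 million rows. If only a small fraction of rows are unprocessed at any time, an index leading on the flag lets the batch seek straight to them:

```sql
-- Index leading on the processed flag; useful when unprocessed rows
-- are a small minority, so the seek touches only the current batch
CREATE INDEX ix_disktrans_processed
    ON disktrans (processed, trans_id);
```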
 
LVL 11

Assisted Solution

by:ram2098
ram2098 earned 100 total points
ID: 11899721
Since you have an archival process in place...I guess this should be fine.

At least you know the maximum size of your database (number of rows in a year) and you can tune accordingly (tune your queries, add views, etc.).

Just an FYI, I have a database of 200 million rows in prod (of course, I keep only three months of data in it), but we manage to keep our queries running fast (within a reasonable limit) with proper indexes etc.

 
LVL 34

Expert Comment

by:arbert
ID: 11899727
"Just an FYI, I have a database of 200 Million rows (Of course ..I keep only three months of data in that) in prod..but managing our queries to run faster (reasonable limit)..with proper indexes etc.
"

Yep, I totally agree--I have several large warehouses that don't utilize any partitioning at all.  In fact, partitioning can actually create some VERY bad plans if it isn't implemented properly.  It really does do a great job from a maintenance perspective though....You'll find much better partitioning and VLDB support in Yukon...
 
LVL 11

Expert Comment

by:ram2098
ID: 11899745
"You'll find much better partitioning and VLDB support in Yukon..."

Arbert: Do you know of any article explaining these new features? I am actually looking for that..