Solved

Table being Big

Posted on 2004-08-25
8
210 Views
Last Modified: 2010-05-18
My application collects data from 20 different applications. The extracts arrive as txt files and are loaded into a temp (working) database; there is no referential integrity there and the data is not indexed. The data is validated and cleaned, and rows that fail validation are held in suspense. The validated data is posted to a main table in the principal database (the disktrans table) and indexed. The batch is run against this table and data is posted into other tables, mostly updates and inserts. The processed rows are then flagged as processed, but the data continues to reside in this table. When the next batch occurs, rows are appended to this table flagged as unprocessed, and all unprocessed rows are taken for processing.

The concern is that this table has 40 million rows. Another table, disttransdt, has 96 million rows. These two are the main tables against which all queries are run. Is there any other way to make this better? Any suggested solution?
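For reference, each batch cycle looks roughly like this (a sketch only; the processed flag and the query shapes are illustrative, not our actual schema):

-- pick up this batch's unprocessed rows
SELECT *
FROM disktrans
WHERE processed = 0

-- ...updates/inserts against the other tables happen here...

-- then flag the batch as done so the next run skips these rows
UPDATE disktrans
SET processed = 1
WHERE processed = 0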
Question by:jagishiv
8 Comments
 
LVL 11

Expert Comment

by:ram2098
ID: 11899577
I guess you can split your working data into two tables: one working table and one archive table.

You can keep only the working data in the main table and move the remaining (old) data to another table called "Archive". (I guess you are not using the processed data any more, except as a backup.)

This way, you can keep the working table size consistent and improve performance.
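Something along these lines (just a sketch; I'm assuming a processed flag and that the archive table has the same structure as disktrans, so the names are illustrative):

BEGIN TRAN

-- copy rows already flagged as processed into the archive table
INSERT INTO disktrans_archive
SELECT * FROM disktrans WHERE processed = 1

-- then remove them from the working table
DELETE FROM disktrans WHERE processed = 1

COMMIT TRAN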
 
LVL 11

Expert Comment

by:ram2098
ID: 11899584
Or you could even consider a separate "archive" database, where you archive the data on a regular basis.
 

Author Comment

by:jagishiv
ID: 11899632
The database is archived once a year because people access the information throughout the year; it is a banking commission application. The data size I am referring to is in the main table, not the working table.
 
LVL 1

Accepted Solution

by:
Parag_Gujarathi earned 250 total points
ID: 11899678
One way of overcoming this problem is to partition the big table into smaller tables. This will definitely speed up your access, but the downside is that you'll have to modify your application logic so that it fetches data from the proper table. You may also consider creating a partitioned view over the tables.
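As a rough sketch (a SQL Server 2000 local partitioned view; the table names and the partitioning column trans_year are just examples, not your actual schema):

-- each member table holds one year, enforced by a CHECK constraint;
-- the partitioning column must be part of the primary key
CREATE TABLE disktrans_2003 (
    trans_id   INT NOT NULL,
    trans_year INT NOT NULL CHECK (trans_year = 2003),
    amount     MONEY,
    PRIMARY KEY (trans_id, trans_year)
)

CREATE TABLE disktrans_2004 (
    trans_id   INT NOT NULL,
    trans_year INT NOT NULL CHECK (trans_year = 2004),
    amount     MONEY,
    PRIMARY KEY (trans_id, trans_year)
)
GO

-- queries through the view that filter on trans_year only touch
-- the matching member table
CREATE VIEW disktrans_all AS
SELECT trans_id, trans_year, amount FROM disktrans_2003
UNION ALL
SELECT trans_id, trans_year, amount FROM disktrans_2004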

For further info on partitioning data in the table check the link below...

http://www.sqlteam.com/Item.ASP?ItemID=684

cheers
Parag.
 
LVL 34

Assisted Solution

by:arbert
arbert earned 150 total points
ID: 11899700
Agree with Parag, this is a perfect application for partitioning. Your data still gets spread over multiple tables, but all the data stays "online" and usable for people to report on.

96 million rows isn't very many for SQL Server. Have you looked at SQL Profiler and the query plans to see whether there are any improvements to be made to the existing indexing/structure?
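For example (disktrans is the table from your question; the rest is illustrative):

-- list the indexes that already exist on the big table
EXEC sp_helpindex 'disktrans'
GO

-- show the estimated plan for a representative batch query
-- (statements are not executed while SHOWPLAN_TEXT is ON)
SET SHOWPLAN_TEXT ON
GO
SELECT * FROM disktrans WHERE processed = 0
GO
SET SHOWPLAN_TEXT OFF
GO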
 
LVL 11

Assisted Solution

by:ram2098
ram2098 earned 100 total points
ID: 11899721
Since you have an archival process in place, I guess this should be fine.

At least you know the maximum size of your database (the number of rows in a year) and can plan accordingly (tune your queries, add views, etc.).

Just an FYI: I have a 200 million row database in production (of course, I keep only three months of data in it), but with proper indexes etc. we keep our queries running reasonably fast.

 
LVL 34

Expert Comment

by:arbert
ID: 11899727
"Just an FYI, I have a database of 200 Million rows (Of course ..I keep only three months of data in that) in prod..but managing our queries to run faster (reasonable limit)..with proper indexes etc.
"

Yep, I totally agree. I have several large warehouses that don't utilize any partitioning at all. In fact, partitioning can actually create some VERY bad plans if it isn't implemented properly. It really does do a great job from a maintenance perspective, though. You'll find much better partitioning and VLDB support in Yukon...
 
LVL 11

Expert Comment

by:ram2098
ID: 11899745
"You'll find much better partitioning and VLDB support in Yukon..."

Arbert: Do you know of any articles explaining these new features? I am actually looking for that.
