Solved

Table being Big

Posted on 2004-08-25
8
Medium Priority
220 Views
Last Modified: 2010-05-18
My application collects data from 20 different applications. The extracts arrive as text files and are loaded into a temporary (working) database, where there is no referential integrity and the data is not indexed. The data is validated and cleaned, and rows that fail validation are held in suspense. Validated data is posted to a main table in the principal database (the disktrans table) and indexed. The batch is run against this table and the data is posted into other tables, mostly as updates and inserts. The rows processed are then flagged as processed, but the data continues to reside in this table. When the next batch runs, new rows are appended with the flag set to unprocessed, and all unprocessed rows are taken for processing. The concern is that this table now has 40 million rows, and another table, disttransdt, has 96 million rows. These are the two main tables against which all queries are run. Is there a better way to handle this? Any suggested solution?
0
Comment
Question by:jagishiv
8 Comments
 
LVL 11

Expert Comment

by:ram2098
ID: 11899577
I suggest you split your working data into two tables: one working table and one archive table.

Keep only the working data in the main table, and move the remaining (old) data to a separate table called "Archive". (I assume you are not using the processed data any more, except as a backup.)

This way you keep the working table at a consistent size and improve performance.
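A minimal T-SQL sketch of this archive-table idea. The column and table names here are assumptions for illustration: it presumes disktrans carries a processed flag and that a structurally identical disktrans_archive table exists.

```sql
-- Illustrative only: move already-processed rows out of the working
-- table into an archive table of the same shape.
BEGIN TRANSACTION;

-- Copy processed rows into the archive table.
INSERT INTO disktrans_archive
SELECT * FROM disktrans
WHERE processed = 1;

-- Remove them from the working table.
DELETE FROM disktrans
WHERE processed = 1;

COMMIT TRANSACTION;
```

On a 40-million-row table you would likely want to do the DELETE in batches (for example with SET ROWCOUNT in a loop on SQL Server 2000) to keep the transaction log manageable.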
0
 
LVL 11

Expert Comment

by:ram2098
ID: 11899584
Or you could even consider a separate "archive" database, where you archive the data on a regular basis.
0
 

Author Comment

by:jagishiv
ID: 11899632
The database is archived once a year, because people access the information throughout the year; it is a banking commission application. The data size I am referring to is in the main table, not the working table.
0
 
LVL 1

Accepted Solution

by:
Parag_Gujarathi earned 1000 total points
ID: 11899678
One way of overcoming this problem is to partition the big table into smaller tables. This will definitely speed up your access, but the downside is that you'll have to modify your application logic so that it fetches data from the proper table. You may also consider creating a partitioned view over the tables.

For further information on partitioning data in a table, check the link below:

http://www.sqlteam.com/Item.ASP?ItemID=684

cheers
Parag.
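A sketch of a local partitioned view on SQL Server 2000, using hypothetical column names (trans_id, trans_date, amount) for the disktrans data, partitioned by year. The CHECK constraint on the partitioning column is what lets the optimizer skip member tables that cannot match a query's date range.

```sql
-- Illustrative only: one member table per year, each with a CHECK
-- constraint on the partitioning column (which must be part of the
-- primary key for the view to be updatable).
CREATE TABLE disktrans_2003 (
    trans_id   INT NOT NULL,
    trans_date DATETIME NOT NULL
        CHECK (trans_date >= '20030101' AND trans_date < '20040101'),
    amount     MONEY,
    PRIMARY KEY (trans_id, trans_date)
);

CREATE TABLE disktrans_2004 (
    trans_id   INT NOT NULL,
    trans_date DATETIME NOT NULL
        CHECK (trans_date >= '20040101' AND trans_date < '20050101'),
    amount     MONEY,
    PRIMARY KEY (trans_id, trans_date)
);
GO

-- The view unions the members; queries filtered on trans_date only
-- touch the relevant year's table.
CREATE VIEW disktrans_all AS
SELECT trans_id, trans_date, amount FROM disktrans_2003
UNION ALL
SELECT trans_id, trans_date, amount FROM disktrans_2004;
```

If the application queries the view with a WHERE clause on trans_date, partition elimination happens automatically, which reduces how much of the application logic needs changing.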
0
 
LVL 34

Assisted Solution

by:arbert
arbert earned 600 total points
ID: 11899700
Agree with Parag, this is a perfect application for partitioning. Your data still gets spread over multiple tables, but all of it stays "online" and usable for people to report on.

That said, 96 million rows isn't very many for SQL Server. Have you looked at SQL Profiler and the query plans to see whether any improvements can be made to the existing indexing/structure?
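One way to do that check, sketched with assumed column names (processed, trans_date): inspect the plan of the batch's driving query, and if it shows a table scan, index the flag column the batch filters on.

```sql
-- Illustrative only: show the estimated plan for the query that
-- selects the unprocessed rows for the next batch run.
SET SHOWPLAN_TEXT ON;
GO
SELECT trans_id, trans_date
FROM disktrans
WHERE processed = 0;
GO
SET SHOWPLAN_TEXT OFF;
GO

-- If the plan shows a full table scan, an index leading on the flag
-- means each batch probes only the unprocessed slice of the 40M rows.
CREATE INDEX ix_disktrans_unprocessed
ON disktrans (processed, trans_date);
```

Running the same queries through SQL Profiler during a batch run will also show which statements dominate the duration and reads, so the indexing effort goes where it pays off.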
0
 
LVL 11

Assisted Solution

by:ram2098
ram2098 earned 400 total points
ID: 11899721
Since you have an archival process in place, I guess this should be fine.

At least you know the maximum size of your database (number of rows in a year) and can tune accordingly (tune your queries, add views, etc.).

Just an FYI: I have a production database of 200 million rows (of course, I keep only three months of data in it), but with proper indexes and so on our queries run reasonably fast.

0
 
LVL 34

Expert Comment

by:arbert
ID: 11899727
"Just an FYI, I have a database of 200 Million rows (Of course ..I keep only three months of data in that) in prod..but managing our queries to run faster (reasonable limit)..with proper indexes etc.
"

Yep, I totally agree--I have several large ware houses that don't utilize any partitioning at all.  In fact, partitioning can actaully create some VERY bad plans if it isn't implemented properly.  It really does do a great job from a maintenance perspective though....You'll find much better partitioning and VLDB support in Yukon...
0
 
LVL 11

Expert Comment

by:ram2098
ID: 11899745
"You'll find much better partitioning and VLDB support in Yukon..."

Arbert: Do you know any article explaining these new features..I am actually looking for that..
0
