How to partition a large table into a temp table to help reduce stored procedure processing times?

I have a stored procedure that runs a SQL script against a table with over 20 million rows of data. Processing of this script takes longer than it should.

What is the best way to deal with tables this large to help reduce stored procedure processing times?

I've tried pulling only a specific date range (the last 2 years, i.e. getdate() - 720) from this table, but it still takes a long time to load the data into the temp table.
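Roughly along these lines (the table and column names below are placeholders, not the actual objects):

-- Sketch of the current approach; dbo.BigTable and TransactionDate are placeholder names.
SELECT *
INTO #Recent                                               -- temp table used by the rest of the proc
FROM dbo.BigTable
WHERE TransactionDate >= DATEADD(DAY, -720, GETDATE());    -- i.e. getdate() - 720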

What are the best techniques to use?
MIKE, Software Solutions Consultant, asked:

Scott Pletcher, Senior DBA, commented:
The best way is to first make sure the table is properly clustered, i.e. that it has the best clustering index key(s) in effect. This applies to all tables, but especially to large ones. Typically 20M rows isn't an issue for SQL Server unless it has to scan the whole table (perhaps even more than once).
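For example, if most of the queries range over a date column, the clustering key would look something like this (object names are illustrative only, not your actual table):

-- Illustrative only: cluster the large table on the column most queries filter or range on.
CREATE CLUSTERED INDEX CIX_BigTable_TransactionDate
    ON dbo.BigTable (TransactionDate);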

MIKE, Software Solutions Consultant (Author), commented:
Good point... I stand corrected. This involves a VIEW, which is a view over a LARGE table.

The deal is, this is a government contract I'm working on, which involves a very large table that is managed/governed by another national IT location. We are allowed access to the table via VIEWS.

I cannot change the view.

I can, however, pull data from the table using the VIEW and place the needed data in another DB where I can do whatever I want. I can assign indexes, etc. AFTER the data is extracted/moved. But again, that would take time.
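Something like the following, where the view and local table names are just placeholders for the government view and my own copy:

-- Sketch: extract the needed rows through the view, then index the local copy.
SELECT *
INTO dbo.LocalBigTable
FROM SomeDB.dbo.vw_BigTable                                -- the VIEW we're allowed to read
WHERE TransactionDate >= DATEADD(DAY, -720, GETDATE());

CREATE CLUSTERED INDEX CIX_LocalBigTable_TransactionDate
    ON dbo.LocalBigTable (TransactionDate);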

Any thoughts?

Is there anything regarding SQL SYNTAX or SQL SCRIPTING itself that would help speed up my query?

I've tested things like using a WITH clause (CTE) instead of a temp table... it didn't seem to matter.
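i.e., roughly swapping the temp-table load for a CTE like this (placeholder names again):

-- Sketch of the WITH-clause variant I tried; object names are placeholders.
WITH Recent AS
(
    SELECT *
    FROM SomeDB.dbo.vw_BigTable
    WHERE TransactionDate >= DATEADD(DAY, -720, GETDATE())
)
SELECT r.*
FROM Recent AS r;   -- the real proc joins/aggregates from here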
Scott Pletcher, Senior DBA, commented:
I would have to see the query to tune it.
ASPSQLServerCOM commented:
Could you possibly give a bit more detail on what you're trying to accomplish? For example, do you need the data for real-time processing and display, or could you pull it from that remote server (view) once a day and store it in your local DB? A little more information would really help us assist you.
PortletPaul, freelancer, commented:
What is the method of access to those views?

The SQL might be very slick already but still run darn slow due to the method of access.
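For example, if the views are reached over a linked server, four-part naming versus OPENQUERY can behave very differently (the server and object names below are made up):

-- Hypothetical linked-server example; GOVSRV, RemoteDB and vw_BigTable are made-up names.
-- Four-part naming: the date filter may only be applied after rows are pulled locally.
SELECT *
FROM GOVSRV.RemoteDB.dbo.vw_BigTable
WHERE TransactionDate >= DATEADD(DAY, -720, GETDATE());

-- OPENQUERY: the whole statement runs on the remote server, so only the filtered rows come back.
SELECT *
FROM OPENQUERY(GOVSRV,
    'SELECT * FROM RemoteDB.dbo.vw_BigTable
     WHERE TransactionDate >= DATEADD(DAY, -720, GETDATE())');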