Indexing temp tables

Posted on 2003-02-20
Medium Priority
Last Modified: 2007-12-19
create table #Sample (...)
create index Sample_SI1 on #Sample (Col1)

select *
from Tab1 T, #Sample S
where T.Col1 = S.Col1

If this is part of a procedure, can the optimizer take advantage of the index? If you try to force the index you will get a message that the index cannot be found and that the optimizer will make an alternate decision (obvious, because at compile time there is no index). Nevertheless, it sometimes seems as if the optimizer does in fact use the index, because of faster execution (not because of cached data).

1. Are we wrong about this or does Sybase in fact use the index at runtime?

2. What if the index was declared as unique? Will Sybase check against duplicates although the index wasn't created when the procedure was compiled?

It seems as if some checks and/or optimizations take place at runtime; could this explain the first question as well?

Question by:OrjanFranzen

Expert Comment

ID: 7998644
Use the following Sybase features to find out what is going on:
dbcc traceon(3604)
dbcc traceon(302, 310, 317)
set showplan on
set fmtonly on
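Wrapped around the procedure call, a diagnostic session might look like the sketch below (the procedure name is a placeholder; traceflag 3604 routes dbcc output to the client session):

```sql
dbcc traceon(3604)            -- route dbcc output to this session
dbcc traceon(302, 310, 317)   -- optimizer index/join costing diagnostics
go
set showplan on
go
exec my_proc                  -- hypothetical procedure under investigation
go
set showplan off
go
dbcc traceoff(302, 310, 317)
go
```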

Expert Comment

ID: 8026929
Yes, it will use the index, if the optimizer thinks the index will be useful. The showplan and dbcc traceon information can give you some hints about what the optimizer thinks is the best option.

Indexes aren't used if the table is less than an extent long, as a table scan is just as fast.  Indexes aren't always terribly useful if you're returning all the columns in the table, since the server may have to scan anyway to get the nonindexed columns.  In some cases the optimizer will even create a temporary index on its own if it will be helpful. The exact query and the exact table structures might give me more to go on here.

Often, if you're loading so much into temp tables that you feel you need an index in order to improve performance, you may want to rethink your strategy.  Load the minimum possible number of rows into temp tables, as all those inserts (and I suspect subsequent deletes) can be expensive.  
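For instance, a narrow load might look like this sketch (table, column, and filter names are made up for illustration):

```sql
-- keep only the join key and the one extra column the final query needs
select Col1, Col2
into #Sample
from BigTable
where Region = 'EU'

-- index after loading, so the data is in place before it is used
create index Sample_SI1 on #Sample (Col1)
```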

If this doesn't help, post more details.


Author Comment

ID: 8027939
Thanks Greg!

Still confused, but a bit more hopeful...

Are you sure that the optimizer will take advantage of the index? When I compile the procedure the optimizer is totally unaware of how many rows the table will contain or how the values on indexed columns are spread. Does this mean that the optimizer will create a new execution plan when the procedure is executed including creating valid statistics???

Regarding big volumes and temp tables, I'm very well aware of the problem and try to rethink the strategy whenever possible. However, there are situations where you have to join two temp tables, each containing thousands of rows, and then taking advantage of suitable indexes is the only possibility. Some of the tables contain millions of rows and are growing, so we are pretty used to handling complex queries and studying execution plans. We frequently use "persistent temp tables" - ordinary tables that are truncated and refilled with data - but this method has some limitations regarding transactions.
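That persistent work-table pattern might be sketched as below (work_sample is a made-up name). One such transactional limitation: in ASE, truncate table is normally disallowed inside a user transaction unless the database option 'ddl in tran' is enabled:

```sql
-- ordinary table reused as a scratch area between runs
truncate table work_sample        -- fast, but not allowed inside a transaction
insert work_sample (Col1)
select Col1 from Tab1

update statistics work_sample     -- keep the optimizer's estimates current
exec sp_recompile work_sample     -- invalidate dependent cached query plans
```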



Expert Comment

ID: 8028159
I've done this before and had the showplan output confirm that the indexes were being used.  It was a while ago, at a client that was running ASE 11.0.3, so I don't recall precisely how it was done.  I do remember that we rethought the entire thing and found a better approach that didn't use temp tables in the same way, and that gave us far better performance.

Your comment about statistics makes me recall that we had some issues around this as well.  I believe that the solution might have been to compile the procedure with a dummy temp table already loaded that provided some default statistics.

If you're dealing with million row temp tables, though, you've got a lot of other issues to wade through.  With a lot more details I can wade into this and give you some better answers.  And if your shop has some real heavy lifting that you need help on, I can get some SPS people out there within days.  

Expert Comment

ID: 8037459
Don't forget that you may need to run update statistics to get the index used, though normally you won't.  Have you created the procedure with recompile set on?  Otherwise try executing sp_recompile on the temp table after populating it but before using it.

Expert Comment

ID: 8040653
The optimizer will not use the index if it has been created in the same procedure as the temporary table.  When the query plan is compiled it doesn't know of the existence of this table, let alone how many rows it has or what the index is.

To be able to use the index you have to:
(1) Create the table
(2) Populate the table
(3) Add the index
(4) Call a 2nd stored procedure with the code that you wish to use the index with.

When the 2nd procedures query plan is formed it will know of the existence of the temp table, how many rows it has and of course the index on the table.
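A minimal sketch of that four-step split (all object names besides Tab1 are made up). Note the classic ASE wrinkle that the temp table must already exist in the session when the inner procedure is created, hence the dummy copy:

```sql
-- dummy #Sample so the inner procedure can be created
create table #Sample (Col1 int not null)
go
create procedure use_sample
as
    select T.*
    from Tab1 T, #Sample S
    where T.Col1 = S.Col1
go
drop table #Sample
go

-- outer procedure performing steps (1) to (4)
create procedure load_sample
as
    create table #Sample (Col1 int not null)      -- (1) create
    insert #Sample (Col1)
    select Col1 from Tab1                         -- (2) populate
    create index Sample_SI1 on #Sample (Col1)     -- (3) index
    exec use_sample                               -- (4) query via 2nd proc
go
```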

Hope this helps

Author Comment

ID: 8041185
Thanks to all!

To Coluac: This is the approach I have found works, and in most cases it solves the problem, i.e. if you don't have to repeatedly create and drop the same temp table in a loop, scanning a main table some thousand rows at a time.

sp_recompile doesn't seem to be 100% trustworthy: sometimes it works, sometimes not, and then only if you use persistent tables, for the reason Coluac described.

Could update statistics, combined with sp_recompile, do the trick? The answer seems to be no:

Try this:
create procedure allan with recompile
as
create table #tt (rad char(1) null)
create index tt_si1 on #tt (rad)
insert #tt (rad) values ('A')
update statistics #tt
select * from #tt (index tt_si1) order by rad

... results in

"Index 'tt_si1' specified as optimizer hint in the FROM clause of table '#tt' does not exist. Optimizer will choose another index instead."

On the other hand we've sometimes got the feeling that the compiler lies..., i.e. it actually takes advantage of the index although the message indicates something else.

Author Comment

ID: 8041288
It seems to be the query analyzer that lies. When I studied the query plan I got the same misleading message, but the statistics IO said something else - who is fooling whom..??

The query plan says:
"Index 'tt_si1' specified as optimizer hint in the FROM clause of table '#tt' does not exist.  Optimizer will choose another index instead."

followed by...

"    STEP 1
        The type of query is INSERT.
        The update mode is direct.
        Worktable1 created for ORDER BY.

        FROM TABLE
        Nested iteration.
        Table Scan.
        Forward scan.
        Positioning at start of table.
        Using I/O Size 16 Kbytes for data pages.
        With MRU Buffer Replacement Strategy for data pages.
        TO TABLE
    STEP 2
        The type of query is SELECT.
        This step involves sorting.

        FROM TABLE
        Using GETSORTED
        Table Scan.
        Forward scan.
        Positioning at start of table.
        Using I/O Size 16 Kbytes for data pages.
        With MRU Buffer Replacement Strategy for data pages."

...followed by

"    STEP 1
        The type of query is SELECT.

        FROM TABLE
        Nested iteration.
        Index : tt_si1
        Forward scan.
        Positioning at index start.
        Index contains all needed columns. Base table will not be read.
        Using I/O Size 2 Kbytes for index leaf pages.
        With LRU Buffer Replacement Strategy for index leaf pages."

Can I trust the query analyzer at all? It would be nice if the worktable weren't created and filled with data without being used later on, because that would be an unnecessary waste of resources.

Expert Comment

ID: 11381940
No comment has been added to this question in more than 21 days, so it is now classified as abandoned.
I will leave the following recommendation for this question in the Cleanup topic area:


Any objections should be posted here in the next 4 days. After that time, the question will be closed.

EE Cleanup Volunteer

Accepted Solution

modulo earned 0 total points
ID: 11410324
PAQed, with points refunded (100)

Community Support Moderator
