Solved

WebPagination_RecordsFilters

Posted on 2011-09-23
10
252 Views
Last Modified: 2012-06-21

I have a catalog table with 100,000 rows, and I am displaying 500 rows per page using a pagination query.

I need to display a dropdown list of all pages, with data (a word column) describing each page, so that when the user selects a page it displays those rows:

Page 1 (Row 1 aaa - Row 499 bbb)
Page 2 (Row 500 ccc - Row 999 fff)
Page 3 (Row 1000 word - Row 1500 word)

How would you implement this (SQL, PL/SQL, or a dedicated table refreshed nightly)? I don't think something like this would be efficient:

select word from (SUBQUERY for RESULTSET) where rownum in (1,499,500,999,....)

I know you don't like this, but it is a hard requirement for one application.
Question by:sam15
10 Comments
 
LVL 73

Expert Comment

by:sdstuber
select *
  from (select x.*, rownum rn
          from (select *
                  from your_table          -- placeholder table name
                 order by some_column) x   -- placeholder sort column
         where rownum <= :your_upper_limit)
 where rn >= :your_lower_limit
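With the 500-row page size from the question, the binds map to a page number like this — :page_no is an assumed name, not from the original comment:

```sql
-- Hypothetical: derive both binds from a single page number, 500 rows/page.
select *
  from (select x.*, rownum rn
          from (select * from your_table order by some_column) x
         where rownum <= :page_no * 500)        -- plays the role of :your_upper_limit
 where rn >= (:page_no - 1) * 500 + 1           -- plays the role of :your_lower_limit
```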
 
LVL 76

Expert Comment

by:slightwv (Netminder)
Why not use row_number() over (order by some_column) and remove one nested select?
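Something like this, for comparison (your_table and some_column are placeholders, as above):

```sql
-- Analytic-function variant: one level of nesting removed.
select *
  from (select t.*, row_number() over (order by some_column) rn
          from your_table t)
 where rn between :your_lower_limit and :your_upper_limit
```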
 
LVL 73

Expert Comment

by:sdstuber
row_number() will force full evaluation of the entire record set.

Using the nested queries with ROWNUM allows the execution plan to introduce a STOPKEY operation, which stops fetching rows once the upper limit is reached.
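One way to see the difference for yourself is to compare the plans — a sketch only, since the exact plan shape depends on your data, indexes, and version:

```sql
explain plan for
select *
  from (select x.*, rownum rn
          from (select * from your_table order by some_column) x
         where rownum <= 500)
 where rn >= 1;

select * from table(dbms_xplan.display);
-- With the ROWNUM form you would typically see a COUNT STOPKEY /
-- SORT ORDER BY STOPKEY step; the row_number() form generally has to
-- sort the full result set before the filter is applied.
```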
 

Author Comment

by:sam15
I already have the pagination query; I think you misunderstood me.

The thing is, I need to display a dropdown list with all the pages and the word range from the records on each page. So when the user selects (Page 11 yaaa - zaaa) it will navigate to those records.

For this dropdown list I would only need to obtain records 1, 499, 500, 999, 1000, 1499, 1500, etc. of the result set. Is there a fast way to do this in SQL?

My thinking is this must be a full table scan (subquery) and then somehow selecting those rows using an incrementing variable, but there might be a better way.
 
LVL 73

Accepted Solution

by:
sdstuber earned 500 total points
oh yuck!

In that case, do something like the pagination query but use an IN list instead of a range to select the individual rows you want:

select *
  from (select x.*, rownum rn
          from (select * from your_table order by some_column) x
         where rownum <= :your_upper_limit)
 where rn in (1, 499, 500, 999, 1000, 1499, 1500)


If you don't know in advance how many pages there will be, then try something like this instead:

select *
  from (select x.*, rownum rn
          from (select * from your_table order by some_column) x
         where rownum <= :your_upper_limit)
 where rn = 1 or mod(rn, 500) in (0, 499)
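A further sketch, not part of the original answer: since the dropdown ultimately needs one label per page, the first and last word of each 500-row block could also be computed directly with a GROUP BY. Note this uses clean 500-row blocks (rows 1-500, 501-1000, ...), slightly different from the boundary rows listed above; word, your_table, and some_column are placeholders:

```sql
select ceil(rn / 500)                                as page_no,
       min(word) keep (dense_rank first order by rn) as first_word,
       max(word) keep (dense_rank last  order by rn) as last_word
  from (select word, rownum rn
          from (select word from your_table order by some_column))
 group by ceil(rn / 500)
 order by page_no
```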

 

Author Comment

by:sam15
Well, it is a dynamic table (records are added weekly), so do I need to count the records first?

I am also wondering how your query would retrieve the selected records. Let us say the count was 100,000:

select *
  from (select x.*, rownum rn
          from (select * from your_table order by some_column) x
         where rownum <= 100000)
 where rn = 1 or mod(rn, 500) in (0, 499)

I have not tested this yet, but would it give me row 1, row 499, row 500, row 999, row 1000, etc.?

I assume you are also doing a FULL TABLE SCAN and there is no way around it, right?
 
LVL 73

Expert Comment

by:sdstuber
It might do an index scan, based on the ORDER BY clause; it depends on how many blocks of the table the query has to visit to find the rows.

If you have a billion rows and "some_column" is relatively clustered, then it's possible that finding the first 100,000 rows will be a relatively lightweight index lookup.

Otherwise, it'll probably do a full table scan.
 

Author Comment

by:sam15
I did a quick test using a one-column table with 100,000 rows; see the attached text file for results. It took 1 second, though it might take longer with the actual table.

Do you think it might be better to create an MV that stores this list and is refreshed nightly? That would be much quicker if the table does not change much from day to day.
testT5.txt
 
LVL 73

Expert Comment

by:sdstuber
If you have the storage, and your data is relatively static, then sure, an MV sounds like a reasonable idea for performance improvement.
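A minimal sketch of what such an MV could look like, assuming the 500-row page size and the placeholder names used above — the refresh schedule and storage options would need tuning for the real table:

```sql
create materialized view page_ranges_mv
  build immediate
  refresh complete
  start with trunc(sysdate) + 1   -- first refresh at next midnight
  next       trunc(sysdate) + 1   -- then nightly
as
select ceil(rn / 500)                                as page_no,
       min(word) keep (dense_rank first order by rn) as first_word,
       max(word) keep (dense_rank last  order by rn) as last_word
  from (select word, rownum rn
          from (select word from your_table order by some_column))
 group by ceil(rn / 500);
```

The dropdown query then becomes a trivial full scan of a tiny table (one row per page) instead of the 100,000-row base table.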
 

Author Comment

by:sam15
sdstuber: you deserve a promotion to Super Genius rank now.

thanks,


