m3developer2 (United States of America) asked:

How can I write the best SQL query to process a WHERE IN list of 10,000 Strings

The logic:
I am trying to query a DB2 Products table for a specific list of about 10,000 item #'s. The list of items comes from another database. I am doing this in Java, so the SQL query gets built dynamically and is then used to connect to and query the DB2 system. But the WHERE IN list is too long and throws an error.

The error: [SQL0101] SQL statement too long or complex.

My query statement:

 SELECT m.MMITNO        
  FROM ItemMaster m
 WHERE m.mmitno IN ('AAZ4FTTBL+', 'AAZ4FTTBL+', 'AAZSHE99307', 'ABC100STOCKPOTA', 'ABC110001', 'ABC110002')

except with about 10,000 item #'s in that WHERE IN list...
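
Roughly, the Java side assembles the statement like this (a simplified sketch; the class and variable names are just illustrative):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.List;

public class ItemQueryBuilder {

    // Builds one giant IN list; with ~10,000 values the resulting statement
    // exceeds DB2's statement size/complexity limit and triggers SQL0101.
    static String buildQuery(List<String> itemNumbers) {
        StringBuilder sql = new StringBuilder(
            "SELECT m.MMITNO FROM ItemMaster m WHERE m.MMITNO IN (");
        for (int i = 0; i < itemNumbers.size(); i++) {
            if (i > 0) sql.append(", ");
            // escape embedded quotes before wrapping each value as a literal
            sql.append("'").append(itemNumbers.get(i).replace("'", "''")).append("'");
        }
        return sql.append(")").toString();
    }

    static void runQuery(Connection db2Conn, List<String> itemNumbers) throws Exception {
        try (Statement stmt = db2Conn.createStatement();
             ResultSet rs = stmt.executeQuery(buildQuery(itemNumbers))) {
            while (rs.next()) {
                System.out.println(rs.getString("MMITNO"));
            }
        }
    }
}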

How do I query that many items in SQL?


Thank you!
Kent Olsen (United States of America):

Hi M3,

That's quite a filter that you're trying to build...

The limit on the query length has nothing to do with the amount of work that the DBMS will do.  It's a practical limit on how large and complex a single statement can be.

The IN predicate is a shortcut way to write "WHERE VarA = 'a' OR VarA = 'b' OR VarA = 'c' OR VarA = 'd' ...".  When the query is parsed, it is converted into as many individual tests as necessary to cover all of the acceptable values, and that filter is evaluated against every row that was not already excluded by other filters.

If you can write the values to another file, even a temporary one, and join that table to the results that you're selecting, you should see an improvement.  Especially if you can index the table.

Is there a connection between your DB2 system and the database with the list of values?  That would almost certainly provide an easy and efficient solution.


Kent
Can you load your 10,000 items into a temp table (whatever the DB2 equivalent of that is) and then

SELECT m.MMITNO        
  FROM ItemMaster m
 WHERE m.mmitno IN (select itemno from temp_table)
Sean Stuber:

load your values into a temporary table

 SELECT m.MMITNO        
  FROM ItemMaster m
 WHERE m.mmitno IN (select your_column from your_temp_table);


your temp table might look something like this...
note, you will only create this table once, not once per user or per session, just once period.

CREATE GLOBAL TEMPORARY TABLE your_temp_table
   ( your_column varchar(20))   -- size the column to fit your longest item number
ON COMMIT PRESERVE ROWS
NOT LOGGED ON ROLLBACK DELETE ROWS;
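
From the Java side, loading the item numbers and running the lookup might look roughly like this (a sketch only; it assumes the temp table above already exists and reuses its placeholder names):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.List;

public class TempTableLookup {

    // Sketch: batch-insert the ~10,000 item numbers into the temp table,
    // then let DB2 do the matching with a single, fixed-size statement.
    static void findItems(Connection db2Conn, List<String> itemNumbers) throws Exception {
        db2Conn.setAutoCommit(false);

        // clear out anything left from a previous run in this session
        try (PreparedStatement del = db2Conn.prepareStatement(
                "DELETE FROM your_temp_table")) {
            del.executeUpdate();
        }

        // batch-insert the item numbers, flushing every 1,000 rows
        try (PreparedStatement ins = db2Conn.prepareStatement(
                "INSERT INTO your_temp_table (your_column) VALUES (?)")) {
            int count = 0;
            for (String itemNo : itemNumbers) {
                ins.setString(1, itemNo);
                ins.addBatch();
                if (++count % 1000 == 0) {
                    ins.executeBatch();
                }
            }
            ins.executeBatch();  // flush the remainder
        }
        db2Conn.commit();

        // one short statement does the whole lookup
        try (PreparedStatement sel = db2Conn.prepareStatement(
                "SELECT m.MMITNO FROM ItemMaster m " +
                "WHERE m.MMITNO IN (SELECT your_column FROM your_temp_table)");
             ResultSet rs = sel.executeQuery()) {
            while (rs.next()) {
                System.out.println(rs.getString("MMITNO"));
            }
        }
    }
}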
m3developer2 (Asker):

Kent,
The two databases are on the same network, although on different subnets (which shouldn't matter). The source db is a MySQL database, and the DB2 is on an iSeries running V6R1 of IBM i. I understand the goal of uploading to a temp table. I have done it manually using iNavigator, where I put the .csv file on the iSeries filesystem to do the bulk upload. But via JDBC, with a Java SQL statement running on a separate server, could I upload a .csv through the SQL code? Or am I way off?
ASKER CERTIFIED SOLUTION: Kent Olsen (United States of America)
(Solution text is available to Experts Exchange members only.)
Kent,
That is straightforward enough! Thank you!

To clarify, our MySQL database is what serves our ecommerce platform, and I'm building a product-syncing tool to keep products synced up. It resides on a separate Linux server elsewhere, but THANKS for that article. Fascinating!

Going a step further, is there a SQL bulk-upload statement for the i, so that I could dump the MySQL table into a .csv file and then read and load it into the 'i' table?
I've requested that this question be closed as follows:

Accepted answer: 0 points for m3developer2's comment #a40302270

for the following reason:

Solution was straightforward, along with other suggestions too...
I think Kent definitely helped m3developer2 develop his solution, so m3developer2 should give Kent the points. It's only fair.
Hi Dave,

I don't know the iSeries well enough to know if it can read and load an entire csv file.  I'm sure that it can, but I don't know the tools.

On the other platforms,

  IMPORT FROM filename OF DEL INSERT INTO mytable (column_name);

but that's probably no help here.
The "Copy From Import File" command (CpyFrmImpF) will most assuredly load a CSV file (located in an IFS folder) into a database table.

e.g.
CPYFRMIMPF FROMSTMF('/MyFolder/CONDETL.TXT')
       TOFILE(mylib/condetl) RCDDLM(*CRLF)
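
If the load has to be driven from the Java side, that CL command can be wrapped in a call to QSYS.QCMDEXC over the same JDBC connection. A rough sketch, assuming the CSV is already in the IFS folder and the IBM Toolbox (jt400) driver is being used:

import java.math.BigDecimal;
import java.sql.CallableStatement;
import java.sql.Connection;

public class RunCpyFrmImpF {

    // Sketch: execute the CL command remotely by calling QSYS.QCMDEXC over JDBC.
    // Assumes the CSV file has already been placed in the IFS folder.
    static void loadCsv(Connection db2Conn) throws Exception {
        String cmd = "CPYFRMIMPF FROMSTMF('/MyFolder/CONDETL.TXT') "
                   + "TOFILE(MYLIB/CONDETL) RCDDLM(*CRLF)";

        try (CallableStatement cs = db2Conn.prepareCall("CALL QSYS.QCMDEXC(?, ?)")) {
            cs.setString(1, cmd);
            // second parameter is the command length, declared as DECIMAL(15,5)
            cs.setBigDecimal(2, new BigDecimal(cmd.length()));
            cs.execute();
        }
    }
}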



HTH,
DaveSlash
I initially hit the wrong link to award points. This format has changed! Anyway, thank you Kent for the answer that worked for this scenario! Thank you everyone else for replying as well!