kaufmed (United States of America) asked:

Return First Matching Query

I have a set of business rules that are prioritized. These rules specify how a customer is located in the database (by various search criteria). The first matching rule wins. Within a stored procedure, how can I basically do something like this:

--Return this if there are rows
SELECT * FROM table WHERE firstCriteria = 'firstValue'

-- Else return this if there are rows
SELECT * FROM table WHERE secondCriteria = 'secondValue'

-- Else return this if there are rows
SELECT * FROM table WHERE thirdCriteria = 'thirdValue'

-- etc.



Due to how our database is structured, I have determined that it will be most efficient to break the queries out into separate statements rather than use one compound WHERE clause.

Now, since I am returning this to .NET code, I actually need to open a cursor in order to have .NET pick up the data. I think this might be where I'm getting thrown (mentally). I believe I can do a SELECT COUNT(1) for each query to check for rows before opening the cursor, but I thought that approach might be cumbersome, messy, slow, or all three. I'm trying to write as clean a query as possible (but willing to make sacrifices if necessary).

I'm afraid my PL/SQL skills are less than padawan level, so I turn to the Oracle gods. I'm not necessarily looking for the complete solution, but a nudge in the right direction would be most appreciated.

Thanks!
SOLUTION by slightwv (䄆 Netminder)
(solution content available to members only)
ASKER CERTIFIED SOLUTION
(solution content available to members only)
Use a DataReader. Each call to DataReader.Read() advances to the next row; it returns false when there are no more rows.

It will be something like this:

DataReader dr = cmd.ExecuteReader();

while (dr.Read())
{
    var field1 = dr["field1"];
}
kaufmed (ASKER):

@slightwv & sdstuber

Won't your approach execute every query? I only want to execute a query when the previous one fails to match (that probably wasn't clear from the OP).

The indexes we have on the tables aren't playing nice with these requirements. I'm attempting to exhaust all manner of query manipulation before I go back to the DBAs to request a new index be added.

@VjSoft
I know how to read the data in .NET. That is not my question. Thank you for your input, though.
slightwv (䄆 Netminder):

I'll let sdstuber continue.  His approach is much cleaner than mine.
SOLUTION
(solution content available to members only)
kaufmed (ASKER):

OK, I understand what you are saying. I realize now that I oversimplified the question, but I'll see if I can make the CASE statement work within my environment. If not, I'll start a new question with a more accurate description.

You've answered the question that I asked, so...

Thanks!
I think I understood your requirement, but it's unfortunately not one that's easy to satisfy with good performance for all cases.

The single query with multiple conditions is "best" in terms of number of times you have to query the table.

However, it might not be the best for performance if you have indexes that satisfy each of the three conditions individually, so that each separate check can very quickly read a row and validate it. That is especially true if the first condition is usually the one that matches.
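For illustration, one common way to fold prioritized rules into a single statement is to rank each matching row by the first rule it satisfies and keep only the best-ranked rule. This is only a sketch; yourtable and the criteria columns are the placeholder names from the question:

```sql
-- Tag every row that matches any rule with the priority of the first
-- rule it satisfies, then keep only rows from the highest-priority rule.
SELECT *
  FROM (SELECT t.*,
               RANK() OVER (ORDER BY CASE
                                         WHEN firstcriteria  = 'firstValue'  THEN 1
                                         WHEN secondcriteria = 'secondValue' THEN 2
                                         ELSE 3
                                     END) AS rule_rank
          FROM yourtable t
         WHERE firstcriteria  = 'firstValue'
            OR secondcriteria = 'secondValue'
            OR thirdcriteria  = 'thirdValue')
 WHERE rule_rank = 1;
```

The trade-off is the one described above: this touches the table once, but the OR'd predicates may prevent Oracle from using a single selective index.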

If you did want to force a check (and you could do it efficiently) then I might try something like this...

Simply execute each query with a ROWNUM = 1 clause, because you only need to check for existence, not actual values. Then open your cursor:


DECLARE
    v_check INTEGER;
    v_cur   SYS_REFCURSOR;
BEGIN
    SELECT COUNT(*)
      INTO v_check
      FROM yourtable
     WHERE firstcriteria = 'firstValue' AND ROWNUM = 1;

    --Return this if there are rows
    IF v_check = 1
    THEN
        OPEN v_cur FOR
            SELECT *
              FROM yourtable
             WHERE firstcriteria = 'firstValue';
    ELSE
        SELECT COUNT(*)
          INTO v_check
          FROM yourtable
         WHERE secondcriteria = 'secondValue' AND ROWNUM = 1;

        --Return this if there are rows
        IF v_check = 1
        THEN
            OPEN v_cur FOR
                SELECT *
                  FROM yourtable
                 WHERE secondcriteria = 'secondValue';
        ELSE
            SELECT COUNT(*)
              INTO v_check
              FROM yourtable
             WHERE thirdcriteria = 'thirdValue' AND ROWNUM = 1;

            --Return this if there are rows
            IF v_check = 1
            THEN
                OPEN v_cur FOR
                    SELECT *
                      FROM yourtable
                     WHERE thirdcriteria = 'thirdValue';
            END IF;
        END IF;
    END IF;
END;


kaufmed (ASKER):

Yeah, I started off with everything in one query, but there is one rule that relies on a substring (last 5 of the SSN) of an indexed column, which negates the index. The DBAs are OK with creating a new index if it is needed, but they want to do it in a manner that scales: today we are working with the last 5 of the SSN; tomorrow we might need to go to 6. My goal was to whittle the rows down to a small enough number that the substring comparison would be inconsequential, but my rules aren't giving me enough columns to work with to achieve that smaller subset. This is why I thought I might run each query as needed; then the hit might not be as bad.

The oversimplification I mentioned above is due to join tables, which unfortunately are not the same tables across all rules.

As it stands right now, my latest incarnation runs at a worst case of about two minutes for a common search criterion (e.g. last name 'Smith'). Less common criteria seem to run a bit snappier. In the web world, though, two minutes is an eternity!
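For what it's worth, the substring predicate itself can usually be made indexable with a function-based index. This is only a sketch under assumptions: the customer table and ssn column are guesses, and the index would have to be recreated on SUBSTR(ssn, -6) if the rule moved from 5 digits to 6:

```sql
-- Index the exact expression the rule searches on (last 5 of the SSN),
-- so the optimizer can use an index range scan instead of a full scan.
CREATE INDEX customer_ssn_last5_ix
    ON customer (SUBSTR(ssn, -5));

-- A query that uses the same expression can then use the index:
SELECT *
  FROM customer
 WHERE SUBSTR(ssn, -5) = '12345';
```

The query must reference the identical expression the index was built on for the optimizer to consider it.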