Jacob Leis

asked on

Combining Queries

I would like to limit connections to the database by combining queries.  What is the best method to go about doing so?

Below is one of my blocks of queries that I would like to consolidate:
<cfquery result="insertPerson" datasource="gm">
    INSERT INTO person (first, last, entity_name)
    VALUES ('#getDonations.firstname#', '#getDonations.lastname#', '#getDonations.organization#')
</cfquery>
<cfset local.personKey = insertPerson["GENERATED_KEY"] />
<!--- get state_id --->
<cfquery name="getState" datasource="gm">
    SELECT  id
    FROM    ref_state
    WHERE   abbr = '#getDonations.state#'
</cfquery>
<!--- insert address1, city, state_id, country_id, postalcode into address --->
<cfquery result="insertAddress" datasource="gm">
    INSERT INTO address (address1, city, state_id, postalcode1)
    VALUES ('#getDonations.address1#', '#getDonations.city#', '#getState.id#', '#getDonations.postalcode#')
</cfquery>
<!--- insert into person_addresses --->
<cfquery result="insertPersonAddress" datasource="gm">
    INSERT INTO person_address (person_id, address_id, type_id)
    VALUES (#local.personKey#, #insertAddress["GENERATED_KEY"]#, 1)
</cfquery>
<!--- get origin_id for person_origin --->
<cfif NOT isDefined('origin')
        OR isDefined('origin') AND structCount(origin) NEQ 2
        OR isDefined('origin.name') AND origin.name NEQ uCase(stateIndex)>
    <cfquery name="origin" datasource="gm">
        SELECT  id, name
        FROM    ref_origin
        WHERE   name = '#uCase(stateIndex)#'
    </cfquery>
</cfif>
<!--- insert into person_origin - need origin_id --->
<cfquery result="getPersonOrigin" datasource="gm">
    SELECT  id
    FROM    person_origin
    WHERE   person_id = #local.personKey#
        AND origin_id = #origin.id#
</cfquery>
<cfif NOT getPersonOrigin.recordCount>
    <cfquery result="insertPersonOrigin" datasource="gm">
        INSERT INTO person_origin (person_id, origin_id)
        VALUES (#local.personKey#, #origin.id#)
    </cfquery>
</cfif>
<!--- insert into ref_global - need campaign_id --->
<cftry>
    <cfquery result="insertGlobal" datasource="gm">
        INSERT INTO ref_global (person_id, campaign_id)
        VALUES (#local.personKey#, #getDonations.campaign_id#)
    </cfquery>
    <cfcatch type="database">
        <!--- continue process - data already exists --->
    </cfcatch>
</cftry>
<cfquery result="insertTag" datasource="gm">
    INSERT INTO person_tag (person_id, tag_id)
    VALUES (#local.personKey#, 6)
</cfquery>


_agx_

EDIT: If your concern is the number of connections, the best method is to wrap the code in a cftransaction, which you should do with related inserts anyway. Aside from the standard ACID benefits, it ensures CF uses the same connection for all of the queries. That said, I believe CF keeps the same db connection for the entire request anyway, so ultimately it may not make a difference beyond guaranteeing that happens.
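For example, here's a minimal sketch using the first couple of statements from your code (the rest of the queries would go inside the same block):

<cftransaction>
    <!--- all related inserts share one connection and commit or roll back as a unit --->
    <cfquery result="insertPerson" datasource="gm">
        INSERT INTO person (first, last, entity_name)
        VALUES ('#getDonations.firstname#', '#getDonations.lastname#', '#getDonations.organization#')
    </cfquery>
    <cfset local.personKey = insertPerson["GENERATED_KEY"] />
    <!--- ... remaining queries from your block ... --->
</cftransaction>

If everything runs without error, the transaction commits when the closing tag is reached; an unhandled exception rolls it all back.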

Assuming CF keeps the same connection, consolidating the statements into fewer cfquery tags probably won't gain much. Also, while you can execute multiple statements within a single query tag, there are some disadvantages:

1. Unlike a stored proc, a cfquery can only return a single result set
2. Executing multiple statements in MySQL requires enabling the allowMultiQueries=true flag on the datasource connection. That leaves every query open to multi-statement SQL injection, so if you ever do enable that setting, be sure to update ALL queries to use cfqueryparam first. Otherwise, they're at risk for injection.
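For example, here's the state lookup from your code rewritten with cfqueryparam:

<cfquery name="getState" datasource="gm">
    SELECT  id
    FROM    ref_state
    WHERE   abbr = <cfqueryparam value="#getDonations.state#" cfsqltype="cf_sql_varchar">
</cfquery>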
>> '#getDonations.firstname#'

Actually, I see the code is inserting data pulled from another query. Assuming that data comes from another table in the same db, have you considered using a stored procedure instead? That would only require a single db call. Just pass in the necessary ids, and use a series of INSERT INTO ... SELECT ... FROM TableName statements to transfer the data into the target tables.
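Roughly speaking, the CF side would then shrink to a single call along these lines. (The procedure name is hypothetical, and I'm assuming your getDonations query exposes an id to pass in; the INSERT INTO ... SELECT logic would live inside the proc.)

<!--- hypothetical proc name and parameter; the actual transfer logic lives in the proc body --->
<cfstoredproc procedure="usp_transfer_donation" datasource="gm">
    <cfprocparam type="in" cfsqltype="cf_sql_integer" value="#getDonations.id#">
</cfstoredproc>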
Jacob Leis

ASKER

My apologies for the delay in my response.  I have not been able to get cftransaction to work thus far in Lucee.  As for using a stored procedure, the data is coming from a database on server 'A' and being moved to a database on server 'B', so as far as I am aware, that is not a possibility.

For now, I will try to implement some suggestions from your first comment.  Thank you _agx_.
EDIT: No worries. Weird; simple transactions should work in Lucee the same as in Adobe CF, unless maybe there are different databases/db servers involved? If you want to post a test case, I'd be happy to take a look later. Though even without a transaction, I'd be surprised if Lucee didn't maintain the same connection for the whole request by default (like Adobe CF). The transaction is more to ensure data integrity: all statements succeed or fail as a unit.
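A bare-bones test case would be something like the sketch below (adjust the columns to match your schema). If transactions are working, neither row should exist after the forced error:

<cftransaction>
    <cfquery datasource="gm">
        INSERT INTO person (first, last) VALUES ('test', 'one')
    </cfquery>
    <cfquery datasource="gm">
        INSERT INTO person (first, last) VALUES ('test', 'two')
    </cfquery>
    <!--- force an error so the whole transaction rolls back --->
    <cfthrow message="force rollback" />
</cftransaction>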

>> so as far as I am aware, that is not a possibility.
Correct.  My idea won't work if the source and target are on different db servers.  I've heard of a few tools that might support this kind of cross-server transfer in MySQL, but I haven't used any of them personally.
The root issue I am attempting to overcome is that the scheduled task I have built times out when there are more than five donors.  The task will run nightly, and we anticipate a peak of up to thirty donors over a 24-hour period.  At this point, I am considering calling the scheduled task for one record, then having it relocate to itself until all of the unprocessed donors are processed.  Edit: Ultimately, it would be a loop that ends when the getDonations query returns no records.
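Something along these lines is what I have in mind (the source datasource, table, 'processed' flag, and page name below are placeholders for whatever I end up using):

<!--- process one donor per pass; the names here are placeholders --->
<cfquery name="getDonations" datasource="serverA" maxrows="1">
    SELECT  id, firstname, lastname, organization, address1, city, state, postalcode, campaign_id
    FROM    donations
    WHERE   processed = 0
</cfquery>
<cfif getDonations.recordCount>
    <!--- run the inserts above for this single donor, flag it processed, then start over --->
    <cflocation url="processDonations.cfm" addtoken="false">
</cfif>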
ASKER CERTIFIED SOLUTION
_agx_

One of the first things I did was to increase the requestTimeout. Unfortunately, that was a fruitless endeavor.  I agree that your last comment describes the best solution(s).  I am going to try a batch-processing method first.  Long term, I will likely dump everything into a temp table on the db server we are pushing the data to and process it from there.  Thanks again, _agx_.
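For the temp-table route, I am picturing something like the sketch below (the staging table name and columns are placeholders), with the per-donor inserts then handled entirely on the target server:

<cfloop query="getDonations">
    <!--- hypothetical staging table on the target server --->
    <cfquery datasource="gm">
        INSERT INTO donor_staging (firstname, lastname, organization, address1, city, state, postalcode, campaign_id)
        VALUES (
            <cfqueryparam value="#getDonations.firstname#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#getDonations.lastname#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#getDonations.organization#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#getDonations.address1#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#getDonations.city#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#getDonations.state#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#getDonations.postalcode#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#getDonations.campaign_id#" cfsqltype="cf_sql_integer">
        )
    </cfquery>
</cfloop>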