The request has exceeded the allowable time limit Tag: CFQUERY

I am running a script to upload about 50,000 lines into my DB with ColdFusion, and after 2 minutes or so I get this error:

 The request has exceeded the allowable time limit Tag: CFQUERY

How can I fix this?
ostashenpAsked:

ostashenpAuthor Commented:
The error came at around line 300. During this test I only entered 1,000 records, but 700 never made it.
Let me know if you need any code examples.
NickVdCommented:
I don't know the first thing about ColdFusion, but if it's anything like PHP, then scripts are only allowed to run for a certain amount of time before being stopped by the system. In PHP, there is a set_time_limit() function that lets the programmer increase the time allowed (or, if called inside a loop, reset the timer so the script effectively runs forever). There may be something similar in CF.

Hope this helps, or at least leads you in the right direction.
ostashenpAuthor Commented:
Ok but here is my next question.

If that is the case, how does a programmer work around it? I have to add about 50,000 entries to my DB a day with some sort of script... Does anyone have any ideas?
NickVdCommented:
Try putting this at the top of your page:

<cfsetting requesttimeout="3600">

Set the time to be whatever you want in seconds.
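One thing to keep in mind: requesttimeout only raises the server-side limit. On a page that runs for many minutes, the browser or a proxy can still give up waiting, so it can help to flush a little output as you go. A rough sketch (the loop bounds and batch size are just placeholders, not your actual values):

<cfsetting requesttimeout="3600">

<cfloop index="i" from="1" to="1000">
    <!--- your insert for one record would go here --->
    <cfif i mod 100 eq 0>
        <cfoutput>#i# records processed<br></cfoutput>
        <cfflush>
    </cfif>
</cfloop>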
ostashenpAuthor Commented:
That does not work; no matter what value I put in there, it still seems to time out.
NickVdCommented:
Then you will have to figure out how to optimize your query.  A slightly slow query will be a very slow query when run 50,000 times :)
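Two things that often help (I haven't seen your code, so the table, column, and datasource names below are made up): run any query whose result doesn't change per row once, before the loop, and wrap the per-row inserts in a single transaction so they commit as one batch instead of one at a time. Purely a sketch:

<!--- invariant lookups go here, outside the loop --->

<cftransaction>
    <cfloop index="rec" list="#fileData#" delimiters="~">
        <cfquery name="insertRow" datasource="yourDSN">
            INSERT INTO your_table (col1, col2)
            VALUES ('#ListGetAt(rec, 1, "|")#', '#ListGetAt(rec, 2, "|")#')
        </cfquery>
    </cfloop>
</cftransaction>

With very large batches you may still want to commit in chunks rather than all 50,000 in one transaction.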
ostashenpAuthor Commented:
That's what I ended up doing; I took a huge insert chunk out of it.
dl8453Commented:
Can you break the upload into segments (something similar to pagination)? You could take a query.recordCount and divide it by some preselected number (50,000 / 200). The result would be how many "pages" your loop would need.

What are you retrieving the 50k records from (flat file, Access database)? Are you running subqueries?
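Something along these lines, just to illustrate the idea (untested; "datavar" stands for whatever variable holds the file contents, and import.cfm is a made-up page name): pass a start position on the URL and only process one slice per request.

<cfparam name="url.start" default="1">
<cfset chunkSize = 200>
<cfset total = listLen(datavar, "~")>
<cfset finish = min(url.start + chunkSize - 1, total)>

<cfloop index="i" from="#url.start#" to="#finish#">
    <cfset rec = listGetAt(datavar, i, "~")>
    <!--- insert logic for one record goes here --->
</cfloop>

<cfif finish lt total>
    <cflocation url="import.cfm?start=#finish + 1#">
</cfif>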
ostashenpAuthor Commented:
It is a text file that I get over a URL.
dl8453Commented:
Can you use cfftp to retrieve the file? If so, you can FTP the file directly to the DB server and then run the query from the CFM page to insert your data directly into the server; that should run very quickly.

Does the file name change, or is it constant? You may need to ask the people providing the data to move it to an FTP server. This sounds like a major undertaking, so I'm sure they will work with you on this.
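For reference, a bare-bones cfftp fetch looks roughly like this (server, credentials, and paths are obviously placeholders):

<cfftp action="getFile"
       server="ftp.example.com"
       username="myuser"
       password="mypass"
       remoteFile="/data/jobs.txt"
       localFile="c:\imports\jobs.txt"
       failIfExists="no">

<cfif cfftp.succeeded>
    <cffile action="read" file="c:\imports\jobs.txt" variable="datavar">
</cfif>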
ostashenpAuthor Commented:
The filename never changes and is always at the same URL. I tried using cfhttp, but it failed because it was too much data. So then I would manually FTP it to my server and run the upload, but it kept saying the query was too large and it would fail. Yes, this is a HUGE project I'm stuck in the middle of.
dl8453Commented:
I've cfftp'd a flat file (txt format) close to 500 MB without a hitch and inserted that data into a SQL table, so this is odd.

Is your cfftp (not cfhttp) working?  If you need some help with this, I can post the cfftp routine I use.

Can you paste what two or three records would look like so we can see the format (include what the delimiter is)?  Also, what db are you using?  And what version of MX?  Lastly, do you have any CF code you can post for the insert?  
ostashenpAuthor Commented:
It is just a flat text file, about 50 MB in size, and I am using SQL Server 2000 with Dreamweaver 8 (but that does not really matter, because I cannot get it to work interactively with CF and the database, so I'm basically using it as a color-coded editor).

Here is what 2 lines of the file would look like:

9343|Restaurant Assistant Manager|Houston|Texas|LOVE'S TRAVEL STOPS & COUNTRY STORES|Management|URL HERE ~
3455|Paralegal|New York City|New York|In Confidence|Legal|URL HERE ~


This might be very inefficient, BUT I have to make sure the company exists in the company table before the job is added. Here is my entire script:

<!--- each record in the feed is terminated with "~"; fields are pipe-delimited --->
<cfset crlf = "~">

<cfloop index="orec" list="#datavar#" delimiters="#crlf#">
  <cfif len(trim(orec))>

    <!--- highest existing job id, used below to build the next job number --->
    <cfquery name="getjobnumberInfo" datasource="test" dbtype="ODBC">
      select max(jid) as MaxNum from Jobs
    </cfquery>

    <!--- split the pipe-delimited record into its seven fields --->
    <cfscript>
      Jobnumber = ListGetAt(orec, 1, "|");
      Jobtitle  = ListGetAt(orec, 2, "|");
      Location  = ListGetAt(orec, 3, "|");
      State     = ListGetAt(orec, 4, "|");
      Company   = ListGetAt(orec, 5, "|");
      Jobfield2 = ListGetAt(orec, 6, "|");
      Website   = ListGetAt(orec, 7, "|");
    </cfscript>

    <!--- defaults and derived values for the inserts --->
    <cfparam name="CBia" default="N">
    <cfparam name="Experience" default="">
    <cfparam name="Referred" default="">
    <cfparam name="Single" default="J">
    <cfparam name="CBIA" default="N">
    <cfparam name="exp_date" default="">
    <cfparam name="eresumes" default="N">
    <cfparam name="send_resumes" default="0">
    <cfset dateentered = CreateODBCDate(now())>
    <cfset timeentered = CreateODBCTime(now())>
    <cfset Moddate = CreateODBCDate(now())>
    <cfset Modtime = CreateODBCTime(now())>
    <cfset exp_date = CreateODBCDate(dateAdd('d', 60, now()))>
    <cfset Temp = Replace(getjobnumberInfo.MaxNum, 'J', '', 'ONE') + 3000>
    <cfset jobnumber = "J#Temp#">
    <cfset login = "login">
    <cfset pw = "ocjobs">
    <cfset topusa = 1>
    <cfset CTDue = CreateODBCDate(dateAdd('d', 60, now()))>
    <cfset CTPackage = "30">
    <cfset CTRenewed = CreateODBCDate(dateAdd('d', 55, now()))>
    <cfset OSJ = "Y">
    <cfset OSR = "N">
    <cfset CBIA = "N">
    <cfset CTR = "N">
    <cfset CTVis = "Y">

    <cfif state is "District of Columbia">
      <cfset state = "Maryland">
    </cfif>

    <cfquery name="getCounties" datasource="test" dbtype="ODBC">
      Select Counties
      From Counties
      Where Counties like '%#state#%'
    </cfquery>

    <!--- make sure the company exists in Clients before the job is added --->
    <cfquery name="getClients" datasource="test" dbtype="ODBC">
      select Company
      from Clients
      where Company = '#Company#'
    </cfquery>

    <cfif getClients.recordCount eq 0>
      <cfquery name="insertCompany" datasource="ocjobs" dbtype="ODBC">
        INSERT INTO Clients
          (City, Company, CTDue, CTPackage, CTR, CTRenewed, CTVis, DateCreated, DateMod, Login, OSJ, OSR, Password, Referred, State, TimeCreated, TimeMod, Single, CBIA)
        VALUES
          ('#Location#', '#Company#', #CTDue#, '#CTPackage#', '#CTR#', #CTRenewed#, '#CTVis#', #dateentered#, #Moddate#, '#Login#', '#OSJ#', '#OSR#', '#PW#', '#Referred#', '#State#', #timeentered#, #Modtime#, '#Single#', '#CBIA#')
      </cfquery>
    </cfif>

    <cfquery name="insertJobs" datasource="test" dbtype="ODBC">
      INSERT INTO JOBS
        (CBia, Company, jobnumber, City, State, location, JobTitle, topurl, JobField2, County, login, pw, single, dateentered, timeentered, Moddate, Modtime, eResumes, send_resumes, exp_date, topusa)
      VALUES
        ('#CBia#', '#Company#', '#jobnumber#', '#location#', '#state#', '#location#', '#JobTitle#', '#website#', '#jobfield2#', '#ValueList(getCounties.Counties)#', '#login#', '#pw#', '#single#', #dateentered#, #timeentered#, #Moddate#, #Modtime#, '#eResumes#', '#send_resumes#', #exp_date#, '1')
    </cfquery>

  </cfif>
</cfloop>
dl8453Commented:
As for the version of MX, I meant CF. Sorry about that. :)

From what I see here, with 50k+ records you are really hammering your CF server by running those queries inside the loop. I'm no king of efficiency myself, but I cannot imagine many servers can afford that many hits. You may want to try something like the following.

If you cannot rewrite your query/insert statements, my suggestion would be to:

- read the input data (the 50 MB+ flat file)
- perform your error checking (e.g. checking that a company exists)
- properly format it
- write the clean data to a tempFile on the server
- perform your inserts based on the new tempFile

Your read/insert pass would not need to do any error checking, since you did that before the tempFile was created. You could take it a step further and name the tempFile tempFile_#dateFormat(now(), 'yyyymmdd')#.txt; that would also give you historical data (just be sure to delete files older than 20 days or so, since you get updates daily).
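Very roughly, the two-pass idea could look like this (only a sketch; the path, the field-count check, and the variable names are placeholders for your real validation and data):

<!--- pass 1: validate/format the raw feed and write a clean temp file --->
<cfset tempFile = expandPath("tempFile_#dateFormat(now(), 'yyyymmdd')#.txt")>
<cfset cleanData = "">

<cfloop index="rec" list="#datavar#" delimiters="~">
    <!--- real checks (company exists, etc.) would go here; this only checks the field count --->
    <cfif listLen(rec, "|") eq 7>
        <cfset cleanData = cleanData & trim(rec) & "~">
    </cfif>
</cfloop>

<cffile action="write" file="#tempFile#" output="#cleanData#">

<!--- pass 2: read the clean temp file back and do nothing but inserts --->
<cffile action="read" file="#tempFile#" variable="cleanVar">
<cfloop index="rec" list="#cleanVar#" delimiters="~">
    <!--- INSERT statements only; no per-row lookups --->
</cfloop>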
dl8453Commented:
Sorry, I left out the query to try...

INSERT INTO clients
  (CBia, Company, jobnumber, City, State, location.......)
SELECT DISTINCT
  CBia, Company, jobnumber, City, State, location......
FROM table

and then repeat for JOBS insert using applicable query.
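If you can first bulk-load the raw rows into a staging table, the "does the company exist" check can also be done set-based in one statement instead of once per row. Something like this, where JobStaging and the column subset are made up for illustration (you would fill in the rest of the Clients columns as in your original insert):

<cfquery name="insertMissingClients" datasource="test">
    INSERT INTO Clients (Company, City, State)
    SELECT DISTINCT s.Company, s.City, s.State
    FROM JobStaging s
    WHERE NOT EXISTS (
        SELECT 1 FROM Clients c WHERE c.Company = s.Company
    )
</cfquery>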
ostashenpAuthor Commented:
Thank you for all of your help; I have it working perfectly now!
dl8453Commented:
I'm glad to hear.  Would you mind sharing your results (remove any proprietary data) if you can?