Richard Korts asked:

Crazy PHP script behavior

See attached php script.

Note on line 16, I am restricting the range variable $cid to between 15000 & 16000.

What I really want to do is not restrict it AT ALL, just let it run from the first value to the last.

By trial & error, I learned that if I restrict the range to 1000, it works.

If I up it to 2000 (say 15000 to 17000), the script just spins for a while and then gives NO result.

The file Customer.txt is a tab delimited file saved from a spreadsheet.

Why can't I do more than 1000 at a time?
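In rough outline, the script does something like the sketch below (this is only a sketch, not the attached file; the mysqli connection details and the assumption that the customer number is the first tab-separated column are placeholders, the attached chk-miss-cust5.php has the real code):

<?php
// Sketch only: read Customer.txt and report customer numbers missing from the customer table.
$link = mysqli_connect('localhost', 'user', 'pass', 'dbname');

$rows = file('Customer.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($rows as $row) {
    $cols = explode("\t", $row);
    $cid  = (int)$cols[0];                       // assumes the customer number is column 1
    if ($cid < 15000 || $cid > 16000) continue;  // the range restriction mentioned above

    $res = mysqli_query($link, "SELECT cid FROM customer WHERE cid = $cid");
    if (mysqli_num_rows($res) == 0) {
        echo "Missing customer: $cid<br>\n";
    }
}
?>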
chk-miss-cust5.php
PHP, MySQL Server

SOLUTION
Julian Hansen

ASKER CERTIFIED SOLUTION
Ray Paseur

Richard Korts

ASKER
Ray,

Thanks, I'll try that.

There cannot be any cid duplicates; that is a unique key on that table.
Ray Paseur

OK, let's take a step back... What is the expected output from this script, i.e., what are you trying to do here?
Richard Korts

ASKER
Ray,

It's kind of a long story.

In March, the table customer in the MySQL database was created basically by uploading ALL the records in customer.txt; there are some auxiliary tables, one called comments. For reasons not relevant here, that is a separate table in the MySQL database.

Today the customer informed me of an anomaly. There was a record in the comments table for which there is no customer in the customer table.

So my first idea was to just read through the .txt file, do an SQL query on the customer table for each record, & echo the ones not there.

I guessed it would be a handful; I know there are only 2 between customers #'s 15000 and 17000.

FYI, the customer numbers start at 10000 and go to 23xxx.

Then I would deal with the handful somehow (tbd).
Julian Hansen

Rather read the CSV into a separate table and then use SQL to do the check - much easier than in code.

You can load the file directly with PHPMyAdmin / Workbench / SQLYog / Heidi or whatever Query tool you are using.

Once imported it is a simple task of

select * from tblcustomer where custid not in (select custid from temp_customer)
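If you'd rather do the import with a plain SQL statement than the tool's import wizard, something like this should handle a tab delimited file (the temp table and column names here are just placeholders; drop the IGNORE 1 LINES clause if the file has no header row):

LOAD DATA LOCAL INFILE 'Customer.txt'
INTO TABLE temp_customer
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(custid, custname, email);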


hielo

>> Today the customer informed me of an anomaly. There was a record in the comments table for which there is no customer in the customer table.
That implies that these tables are related by a common field. Let's say you have:

Customers
=========
cid -- this is the primary key

Comments
=========
id   -- this is the primary key
cid -- this is the "link" between these two tables.

You can try executing:
SELECT A.cid FROM Comments A WHERE NOT EXISTS( SELECT B.cid FROM Customers B WHERE B.cid=A.cid)

You should get the cid values in Comments that don't have a matching/related record in Customers.
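If you want to run that check from PHP and just echo the orphan cids, a minimal version could look like this (assuming mysqli and the table/column names sketched above):

$link = mysqli_connect('localhost', 'user', 'pass', 'dbname');
$sql  = "SELECT A.cid FROM Comments A
         WHERE NOT EXISTS (SELECT B.cid FROM Customers B WHERE B.cid = A.cid)";
$res  = mysqli_query($link, $sql);
while ($row = mysqli_fetch_assoc($res)) {
    echo "Comment with no matching customer: {$row['cid']}<br>\n";  // orphaned comment record
}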
Richard Korts

ASKER
The problem was caused by the GoDaddy server taking too long to execute; they (GoDaddy) seem to have fixed the problem, and my originally posted script works fine now.

I suspected this because of other issues going on with other parts of our system.

Points awarded for effort.
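In case it recurs: since the symptom was the script spinning and then returning nothing while the server was slow, one thing to rule out next time is PHP's execution time limit; raising it at the top of the script is a quick test (the 300 seconds here is just an example value, and a shared host may cap it):

set_time_limit(300);                      // allow up to 5 minutes for this run
// or equivalently: ini_set('max_execution_time', 300);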