leobaz2

asked on

Maximum Size for PHP Array

Is there a maximum size that a PHP array can be?  I recently purchased a product to parse Excel files.  This product reads the whole Excel file into a PHP array.  However, the Excel file I am trying to parse has 18,000 rows and about 20 columns, and for some reason the script just crashes in the middle of parsing it.

I have increased the amount of memory for PHP and also the execution time, so those shouldn't be the problem.
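
For reference, a minimal sketch of setting those same two limits from inside the script rather than php.ini, assuming the host allows them to be changed at runtime (the values are just examples):

<?php
// Minimal sketch: raise the two limits from inside the script instead of
// php.ini. The values are examples only; some hosts do not allow
// memory_limit to be overridden at runtime.
ini_set('memory_limit', '256M');       // memory available to this script
ini_set('max_execution_time', '300');  // seconds before the script is killed
set_time_limit(300);                   // runtime equivalent of the line above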

So I just want to know whether there is a limit on the number of indexes a PHP array can have.

Thanks
KarlPurkhardt

I don't know how much you can store in a PHP array, but why would you need to store that much data anyway?  It isn't very practical to keep that much data in an array.  If the data is only being stored so it can be processed, I'd rethink the way you process it: handle each row as it is retrieved from the spreadsheet rather than storing everything in an array and processing it afterwards.
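
As a rough sketch of that approach, assuming the data can be exported to CSV, and with process_row() standing in as a hypothetical placeholder for whatever per-row work is needed:

<?php
// Stream the rows one at a time instead of accumulating them in a big array.
function process_row($row)
{
    // e.g. insert into a database, write to another file, update totals...
}

$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Could not open data.csv');
}

while (($row = fgetcsv($handle)) !== false) {
    process_row($row);   // only one row is held in memory at a time
}

fclose($handle);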
I've used arrays with 60000 rows, so that's not the problem.  Odds are it's a memory issue.  Can you figure out the largest file you can use without a crash?

As Karl says, you may want to find some way to read and parse the file a few lines at a time.
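
One way to see that the practical ceiling is memory rather than a fixed number of elements is a quick test along these lines (a sketch only; the row count and 20-column shape just mimic the spreadsheet, and exact figures vary by PHP version):

<?php
// Rough test: PHP arrays have no fixed maximum number of elements;
// what gives out first is memory_limit. Watch usage grow as rows are added.
$rows = array();
for ($i = 0; $i < 100000; $i++) {
    $rows[] = array_fill(0, 20, 'some cell data');   // ~20 columns per row

    if ($i % 10000 === 0) {
        printf("rows: %6d  memory: %d bytes\n", $i, memory_get_usage());
    }
}
printf("peak memory: %d bytes\n", memory_get_peak_usage());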

ASKER

Well, what I did was keep deleting rows until I found the point at which the parsing worked.  I was able to find the exact character that was making it crash: if I remove that last character it works; if I put a character back in, it fails.  That made me assume it was a memory problem, but then I doubled the memory with the character still in there and it still failed.

Is there anything special I have to do for the new memory limit to take effect?  I restarted Apache after changing php.ini.  I also made sure that PHP was configured with the "--enable-memory-limit" option.
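
One quick check, in case the web server is reading a different php.ini than expected, is to have the parsing script itself print the values it actually sees (a small sketch; phpinfo() shows the same information plus which ini file was loaded):

<?php
// Print the limits the running script actually sees, rather than trusting
// php.ini. The CLI and the Apache module can read different ini files, so
// run this through the same SAPI that runs the parser.
echo 'memory_limit:       ', ini_get('memory_limit'), "\n";
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";
echo 'current usage:      ', memory_get_usage(), " bytes\n";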
Interesting.  What's the maximum file size that works?

You also might contact whoever sold the product to you and see what they say.

ASKER

I am not sure what the maximum file size is right now, since I am at work and the rest of my stuff is at home.  I believe it breaks with around a 6-8 MB file and works with 5 MB and smaller.

The company I bought this product from was no help at all.  I opened two issues with them; they had no idea and gave up, and now they are ignoring me.  They mentioned that Microsoft "might" change the way it saves the Excel sheet for files larger than a certain size.

The problem is that this Excel parser doesn't use COM, since it also has to run on Unix platforms.  So it does pure binary parsing of the file format, which makes it much harder to debug and fix.
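
One thing that might help narrow it down is logging the last fatal error when the script dies: a memory failure reports "Allowed memory size ... exhausted", while anything else points back at the parser itself.  A sketch, assuming PHP 5.3+ and with ExcelParser as a hypothetical stand-in for the purchased parser's API:

<?php
// Sketch: record why the script died mid-parse.
register_shutdown_function(function () {
    $error = error_get_last();
    if ($error !== null && $error['type'] === E_ERROR) {
        error_log(sprintf(
            'Fatal error during parse: %s in %s on line %d',
            $error['message'], $error['file'], $error['line']
        ));
    }
});

error_reporting(E_ALL);

// Hypothetical stand-in for the purchased parser's API.
$parser = new ExcelParser();
$rows   = $parser->parse('big-spreadsheet.xls');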

ASKER CERTIFIED SOLUTION
snoyes_jw

This solution is only available to members.

ASKER

Thanks a lot for the investigation.  I will look at this more tonight to see what is going on.
@snoyes

That's probably "fast save" working against you.  Microsoft had the bright idea of not rewriting the file when you change it, but instead keeping the old contents and appending the changes, which means Excel has to reconstruct everything when the file is loaded.

@leobaz
You said that when you removed a single character it could process the file, even though much larger files without that character work fine?  Which character is it?
And is there any chance you could convince your users to just "Save As..." a CSV file?  That would probably solve your headaches.
arantius:  I don't doubt your superior knowledge of Excel, but if "fast save" is the case, I wonder why the file gets smaller as I delete more lines?  It's only with the first deletion that it gets bigger; after that it gets progressively smaller as I delete more lines from the same file.