Solved

How can I make this more efficient?

Posted on 2014-09-23
125 Views
Last Modified: 2014-09-26
I'm parsing out a JSON file using the following code:

 //This input should be from somewhere else, hard-coded in this example
$file_name = '00_8ptcd6jgjn201311060000_day.json.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name); 
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb'); 
// Keep repeating until the end of the input file
while(!gzeof($file)) {
// Read buffer-size bytes
// Both fwrite and gzread are binary-safe
  fwrite($out_file, gzread($file, $buffer_size));
}  
// Files are done, close files
fclose($out_file);
gzclose($file);

$jsondata = file_get_contents("00_8ptcd6jgjn201311060000_day.json");
$json = json_decode($jsondata, true);
//echo $json
$output = "<ul>";
foreach ($json['id'] as $id) {
    $output .= "<h4>" . $id . "</h4>";
    $output .= "<li>" . $id['actor/id'] . "</li>";
    $output .= "<li>" . $id['actor/displayName'] . "</li>";
    $output .= "<li>" . $id['actor/postedTime'] . "</li>";
    $output .= "<li>" . $id['generator/displayName'] . "</li>";
    $output .= "<li>" . $id['geo/type'] . "</li>";
    $output .= "<li>" . $id['geo/coordinates/0'] . "</li>";
    $output .= "<li>" . $id['geo/coordinates/1'] . "</li>";
}
$output .= "</ul>";
echo $output;



The first part, decompressing the file, works fine. The problem comes when I read the decompressed JSON back in. I get this:

( ! ) Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1086054108 bytes) in C:\wamp\www\json\uncompress.php on line 30

How can I process things incrementally so I don't run out of memory?
Question by:brucegust
6 Comments
 
LVL 109

Accepted Solution

by:
Ray Paseur earned 500 total points
ID: 40340426
What is line 30?  Compare these numbers.  I think we will have to find a way to process this data incrementally.

   134,217,728 - memory limit
1,086,054,108 - requirement
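
As a sketch of what incremental processing could look like (this assumes newline-delimited JSON, one object per line, which is not confirmed above but is what the later comments rely on): the gzip archive can be streamed with gzgets() so the whole document never sits in memory.

// Sketch only: stream the gzip file line by line instead of
// decompressing to disk and loading it all with file_get_contents().
// Assumes newline-delimited JSON (one object per line).
$file_name = '00_8ptcd6jgjn201311060000_day.json.gz';
$file = gzopen($file_name, 'rb');
while (!gzeof($file)) {
    $line = gzgets($file, 8192); // one decompressed line, up to 8 KB
    if ($line === false || trim($line) === '') {
        continue;
    }
    $record = json_decode($line, true);
    if ($record === null) {
        continue; // skip lines that are not valid JSON
    }
    // process one record here; memory use stays bounded
}
gzclose($file);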
 

Author Comment

by:brucegust
ID: 40341512
Morning, Ray!

I agree. The size of the file is 1,060,592 KB, so doing things in stages is going to be essential.

The file, by the way, is a decompressed JSON file that I need to parse and then insert into a database. I've got 365 such files to process that way. After it's all done, I plan on writing a script that exports the parsed results from the database to a CSV file.

That's the goal for today.

What do you think?
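
For the CSV step, a minimal sketch with fputcsv(), which streams one row at a time instead of building the whole file in memory (the table and column names here are hypothetical placeholders, not taken from the data):

// Sketch: export parsed rows from the database to CSV, streamed.
// The table name (activities) and columns are hypothetical placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$out = fopen('export.csv', 'w');
fputcsv($out, array('actor_id', 'display_name', 'posted_time')); // header row
$stmt = $pdo->query('SELECT actor_id, display_name, posted_time FROM activities');
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fputcsv($out, $row); // one record per line, never the whole file in RAM
}
fclose($out);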
 
LVL 109

Expert Comment

by:Ray Paseur
ID: 40341760
PHP may not be the right tool for this, or you may need to get a very, very large server and increase the memory limit to the stratosphere.  What instruction is on line 30?
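
For reference, the limit can be raised per script, though that only postpones the problem here, since decoding a 1 GB JSON file needs several times that in RAM for the resulting arrays:

// Raise the limit for this script only (php.ini is the permanent place).
// With a ~1 GB JSON file the decoded arrays can need several GB,
// so this alone is unlikely to be enough.
ini_set('memory_limit', '2048M');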

Author Comment

by:brucegust
ID: 40341875
Hey, Ray!

The "instruction" at line 30 is $jsondata=file_get_contents("00_8ptcd6jgjn201311060000_day.json");

Since getting into work this a.m., I've been trying to figure out how to break the elephant down into bite-sized pieces, and I've yet to figure it out.

Here's what I've got thus far:

$jsondata = file_get_contents("00_8ptcd6jgjn201311060000_day.json");
//breaking the elephant down into byte sized pieces
$json_size = 4096;
$buffer = fgets($jsondata, $json_size);
$json = json_decode(($buffer), true);
//echo $json
 while (!feof($json))
 {



Problem is, I never get to $buffer because I'm getting hung up on $jsondata, in light of the file being over 1 GB.

Is there a way to do something like $jsondata=file_get_contents($file_name, $json_size)?

I see how it works with fgets, but how about on the "get_contents" side?
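
For what it is worth, file_get_contents() does accept an offset and a maximum length as its fourth and fifth parameters, so something close to that call is possible. The catch is that a fixed byte window almost always cuts a JSON record in half, which is why the fgets() approach in the next comment works out better:

// file_get_contents() can read a fixed window via offset + maxlen:
$json_size = 4096;
$offset    = 0;
$chunk = file_get_contents('00_8ptcd6jgjn201311060000_day.json',
                           false, null, $offset, $json_size);
// Caveat: a 4 KB window usually ends mid-record, so json_decode($chunk)
// fails; reading whole lines with fgets() avoids that.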
 
LVL 109

Expert Comment

by:Ray Paseur
ID: 40341923
Where did that file come from?  Is there a URL that I can read?

PHP functions are all documented in the online man pages.  Example:
http://php.net/manual/en/function.file-get-contents.php
 

Author Comment

by:brucegust
ID: 40346141
Ray, here's what I came up with:

$chunk_size = 4096;
$url = '00_8ptcd6jgjn201311060000_day.json';
$handle = @fopen($url, 'r');
if (!$handle) {
    echo "failed to open JSON file";
}
while (!feof($handle)) {
    $buffer = fgets($handle, $chunk_size);
    if (trim($buffer) !== '') {
        $obj = json_decode($buffer, true);
        // the rest of my code
    }
}
fclose($handle);
It works!

Thanks for your help!
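
Note that this works because each record in the file sits on its own line; fgets() would split a record that spans lines. As a sketch of what "the rest of my code" might look like for the database step described earlier (DSN, credentials, and the table name are placeholders; the key names are copied from the question's code):

// Sketch: insert each decoded record with a prepared statement,
// so the query is parsed once and reused for every line.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$ins = $pdo->prepare(
    'INSERT INTO activities (actor_id, display_name, posted_time)
     VALUES (?, ?, ?)'
);
$handle = fopen('00_8ptcd6jgjn201311060000_day.json', 'r');
while (!feof($handle)) {
    $buffer = fgets($handle, 4096);
    if (trim($buffer) === '') {
        continue;
    }
    $obj = json_decode($buffer, true);
    if ($obj === null) {
        continue; // skip malformed lines rather than aborting the run
    }
    $ins->execute(array(
        $obj['actor/id'],          // key names taken from the question code
        $obj['actor/displayName'],
        $obj['actor/postedTime'],
    ));
}
fclose($handle);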
