Solved

PHP exec(): executing a PHP script, how do I do that?

Posted on 2011-03-02
598 Views
Last Modified: 2012-05-11
Hello all,

 I was wondering how to execute a PHP file with PHP's exec() without waiting for any response. I have too many cron jobs. Instead of going to the server and adding them manually, I would like to add just one file; let's call it cron_jobs.php. Of course I am going to have other PHP scripts like check_status_bot.php and check_files_bot.php.

 I can pull any information from the database related to the bots: which ones are going to be executed and when, if the conditions are matched.

 I can do the rest very easily, but I have never tried a function like exec() before. What I need from the exec() function is:

1. It shouldn't wait for a response after executing the script (such as whether it was successful or not). If possible, the responses should be logged to a log file.

2. I am not sure if this is already a feature of the exec() function or not, but I would like to run the files under the php / apache user or another user. My problem here is that if I try to execute a PHP file which is going to take a long time to finish, it shouldn't keep my cron_jobs.php file busy; it should just start it, and while the other script is being executed, cron_jobs.php should process the rest of its code.

3. Is it secure to use? Could this function create any security holes?

 I'm sorry for my English; I might make some mistakes. I just hope I was able to describe clearly what I really want to achieve. I would be really glad if anyone could help me get past this barrier so I can continue my project and make my life easier.
Question by:pixalax
10 Comments
 
LVL 6

Expert Comment

by:hexer4u
Hi

1. I have never heard of exec() NOT returning something, so it might not be OK for you. It will always wait for the process to complete, and that's a problem in your case. I suggest cURL instead (other issues arise).
2. Related to the first point.
3. As long as you don't exec() user input you should be fine. Basically exec() will run a command just like the console would (see the sketch below).
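
To illustrate point 3: if variable data ever does end up in an exec() call, escape it first. Just a rough, untested sketch (the path is only a placeholder):

// Sketch: escape anything variable before it reaches the shell
// (note: called like this, exec() still waits for the script, as in point 1)
$script = '/var/www/crons/check_status_bot.php'; // placeholder path
$output = array();
$status = 0;
exec('php ' . escapeshellarg($script), $output, $status);
// $output holds whatever the script printed, $status its exit code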

I would suggest the following: have the code in separate files, protected by .htaccess or some other type of protection. Use the cron_job.php file to call cURL on those files, set the timeout to a very low value (see the cURL parameters), and set it so that it DOES NOT wait for a response. This way it will run the PHP file but not wait for the response, and the rest of the script will continue. The only issues with this are that 1) the user will be "apache" and 2) the files have to be accessible via URL (e.g. http://www..sss/somefile.php), but this shouldn't pose problems if you know how to protect them.
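
For the protection part, something along these lines at the top of each of those files could work (just a sketch; the check assumes the cURL calls are made from the same server, and you may prefer a shared-secret parameter instead):

// Sketch: refuse requests that do not come from the local machine
$allowed = array('127.0.0.1', '::1');
if (!in_array($_SERVER['REMOTE_ADDR'], $allowed))
{
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}
// ... the actual cron work goes below ...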

Regards
 
LVL 108

Expert Comment

by:Ray Paseur
I would use CURL POST to start the asynchronous scripts.  You can set a timeout when you set up the CURL parameters.  The script will get started and when the timeout expires, your main script will regain control.  A short timeout, like 1 second, will let you start a lot of scripts very fast.
 
LVL 108

Accepted Solution

by:
Ray Paseur earned 500 total points
See if this code snippet makes sense.  Please post back if you still have questions, ~Ray
<?php // RAY_curl_post_example.php
error_reporting(E_ALL);


// DEMONSTRATE HOW TO USE CURL POST TO START AN ASYNCHRONOUS PROCESS


function curl_post($url, $post_array, $timeout=2, $error_report=FALSE)
{
    // PREPARE THE POST STRING
    $post_string = '';
    foreach ($post_array as $key => $val)
    {
        $post_string .= urlencode($key) . '=' . urlencode($val) . '&';
    }
    $post_string = rtrim($post_string, '&');

    // PREPARE THE CURL CALL
    $curl = curl_init();
    curl_setopt( $curl, CURLOPT_URL,            $url         );
    curl_setopt( $curl, CURLOPT_HEADER,         FALSE        );
    curl_setopt( $curl, CURLOPT_POST,           TRUE         );
    curl_setopt( $curl, CURLOPT_POSTFIELDS,     $post_string );
    curl_setopt( $curl, CURLOPT_TIMEOUT,        $timeout     );
    curl_setopt( $curl, CURLOPT_RETURNTRANSFER, TRUE         );

    // EXECUTE THE CURL CALL
    $htm = curl_exec($curl);
    $err = curl_errno($curl);
    $inf = curl_getinfo($curl);

    // ON FAILURE
    if (!$htm)
    {
        // PROCESS ERRORS HERE
        if ($error_report)
        {
            echo "CURL FAIL: $url TIMEOUT=$timeout, CURL_ERRNO=$err";
            echo "<pre>\n";
            var_dump($inf);
            echo "</pre>\n";
        }
        curl_close($curl);
        return FALSE;
    }

    // ON SUCCESS
    curl_close($curl);
    return $htm;
}


// USAGE EXAMPLE CREATES ASSOCIATIVE ARRAY OF KEY=>VALUE PAIRS
$args["name"]  = 'Ray';
$args["email"] = 'Ray.Paseur@Gmail.com';

// ACTIVATE THIS TO SEE THE ARRAY OF ARGS
// var_dump($args);

// SET THE URL
$url = "http://LAPRBass.com/RAY_bounce_post.php";

// CALL CURL TO POST THE DATA
$htm = curl_post($url, $args, 3, TRUE);

// SHOW WHAT CAME BACK, IF ANYTHING
if ($htm)
{
    echo "<pre>";
    echo htmlentities($htm);
}
else
{
    echo "NO RESPONSE YET FROM $url -- MAYBE BECAUSE IT IS RUNNING ASYNCHRONOUSLY";
}


 
LVL 2

Author Comment

by:pixalax
@Ray_Paseur;
Thanks a lot for your help. I'm sorry to make you wait this long. A day after my post my aunt passed away, so I didn't get a chance to try it until now.

It is working like a charm, but I have a question:

If I set $timeout = 1 and the script takes 10 minutes in total to execute, will that affect anything?

Let me give you an example:
Let's say I executed this script and there are over 30,000 members:
http://localhost/AS/crons/members/calculate_payments.php?start=0&end=50

At the end, this script will always redirect to the next values (?start=50&end=100, ?start=100&end=150, etc.) until it has calculated all 30,000. Obviously it will take more than a minute to go through all 30,000 members. What will happen if it takes more than 1 second to do it all?

1. Will the script still execute?
2. What if I don't have any post data? I just want to run "http://localhost/AS/crons/members/unban.php" if there are, let's say, 10 members, so I don't need any post data. Can I simply edit it to:

function curl_post($url, $post_array=NULL, $timeout=2, $error_report=FALSE)

if ($post_array != NULL)
{
    // PREPARE THE POST STRING
    $post_string = '';
    foreach ($post_array as $key => $val)
    {
        $post_string .= urlencode($key) . '=' . urlencode($val) . '&';
    }
    $post_string = rtrim($post_string, '&');
}



AND

($post_array != NULL) ? curl_setopt( $curl, CURLOPT_POSTFIELDS,     $post_string ) : '';

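
Put together, the whole modified function would look roughly like this (untested, error reporting left out for brevity, just so the combination I mean is clear):

function curl_post($url, $post_array=NULL, $timeout=2, $error_report=FALSE)
{
    // PREPARE THE CURL CALL
    $curl = curl_init();
    curl_setopt( $curl, CURLOPT_URL,            $url     );
    curl_setopt( $curl, CURLOPT_HEADER,         FALSE    );
    curl_setopt( $curl, CURLOPT_TIMEOUT,        $timeout );
    curl_setopt( $curl, CURLOPT_RETURNTRANSFER, TRUE     );

    // ONLY SEND POST DATA IF THERE IS ANY, OTHERWISE IT IS A PLAIN GET
    if ($post_array != NULL)
    {
        $post_string = '';
        foreach ($post_array as $key => $val)
        {
            $post_string .= urlencode($key) . '=' . urlencode($val) . '&';
        }
        $post_string = rtrim($post_string, '&');
        curl_setopt( $curl, CURLOPT_POST,       TRUE         );
        curl_setopt( $curl, CURLOPT_POSTFIELDS, $post_string );
    }

    // EXECUTE THE CURL CALL AND RETURN WHATEVER CAME BACK (FALSE ON TIMEOUT)
    $htm = curl_exec($curl);
    curl_close($curl);
    return $htm;
}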


Will this still work without any problems?

Thanks a lot for your help and understanding.
 
LVL 108

Expert Comment

by:Ray Paseur
I am sorry to hear of your loss; please accept my sympathy and prayers.

I think you are mostly on firm ground in what you are describing.  Two possible things to consider.  I do not know if these are in play or not, so I will just mention them as a "word to the wise."

The asynchronous script might want to have ignore_user_abort(TRUE).
http://us.php.net/manual/en/function.ignore-user-abort.php

The asynchronous script will not have any browser output, so for your debugging purposes you can use ob_start() to capture any messages from the script.  At the end of the script, you can do something like this to get the messages:

$msg = ob_get_clean();
mail('You@Your.org', 'Async Output', $msg);

Of course if the async script dies for some reason before it sends the message (or if it has a parse error) you will not get output from it, so be careful about that part.
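
Putting those pieces together, the skeleton of an async script might look something like this (just a sketch; the address and subject are placeholders):

<?php // SKETCH OF AN ASYNC SCRIPT FRAME -- ADAPT AS NEEDED
error_reporting(E_ALL);

// KEEP RUNNING EVEN AFTER THE CALLING SCRIPT'S SHORT CURL TIMEOUT EXPIRES
ignore_user_abort(TRUE);

// CAPTURE ANYTHING THE SCRIPT WOULD HAVE SENT TO THE BROWSER
ob_start();

// ... THE LONG-RUNNING WORK GOES HERE ...
echo "Finished at " . date('c') . PHP_EOL;

// MAIL THE CAPTURED OUTPUT TO YOURSELF FOR DEBUGGING
$msg = ob_get_clean();
mail('You@Your.org', 'Async Output', $msg);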

HTH, ~Ray
 
LVL 2

Author Comment

by:pixalax
Hello Ray,

 Thank you for your replies and your prayers. Do I really have to use ignore_user_abort? My idea here is:

1. I have crons/index.php -> this script will be defined on the server as a cron job. It will execute automatically every minute.
2. What I will do is the following:

# Get List Of The Crons
$founds = crons::find_all();

foreach ($founds as $found) {
	# If Cron Never Executed Before
	if ($found->lastRun === NULL) {
		$data = array();
		$data['start'] = 0;
		$data['end'] = 20;
		
		$url = SITE_ROOT.CRONS_DIR.$found->path.$found->name.".php";
		
		$response = curlPost($url,$data,3,TRUE);
		
		if ($response){
			echo "<pre>";
			echo htmlentities($response);
		}
		else{
			echo "NO RESPONSE YET FROM $url -- MAYBE BECAUSE IT IS RUNNING ASYNCHRONOUSLY";
		}
	}
	# If Cron Was Executed Before
	else {
/*
1. Get the date when the script was executed last time
2. Check the cron level from the database to see when it should be executed again (depending on the last execution time)
3. If we have to execute the script, execute it.
*/
	}
}

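
For the else branch I have something roughly like this in mind (just a sketch; the level column is a placeholder for however I end up storing the interval):

# Sketch: decide whether a previously executed cron is due again
function cron_is_due($found) {
	$elapsed  = time() - strtotime($found->lastRun);  # seconds since the last run
	$interval = (int)$found->level * 60;              # e.g. level 5 = every 5 minutes
	return ($elapsed >= $interval);
}

so the else branch would just become if (cron_is_due($found)) { ...the same curlPost() call as above... }.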


 As you can see, I might have maybe 50 crons here to run every minute (well, realistically 5 to 10 tops every minute). My idea here is to not deal with the server's hosting control panel; instead I deal with it once, and after that I can add any cron job from my admin panel. I will not need any traffic to my website or admin panel in order to execute the scripts.

 Since the server is going to execute crons/index.php, do I really need to use ignore_user_abort? My only concern here is the timeout. I just want to execute the script and go on with my code while it is being executed on the server (via Apache).
 
LVL 108

Expert Comment

by:Ray Paseur
Do I really have to use ignore_user_abort?
I don't know.  I wrote that it might want to have ignore_user_abort(TRUE) because the script that starts it will, in essence, "abort" after starting the async script.  You might want to try it both ways.

As far as having a central cron job run once a minute and start other scripts, that is a reasonable design pattern.  I have used it in the past to create a "dispatcher" that looked for external signals and started scripts to handle the detected conditions.  The part I have never been comfortable about is single-point-of-failure dispatcher mortality -- if the central cron job fails, your system is down.

Best of luck with it, ~Ray
 
LVL 2

Author Closing Comment

by:pixalax
Thank you for your help, time and concern.
 
LVL 108

Expert Comment

by:Ray Paseur
Thanks for the points.  It's a really good question! ~Ray
 
LVL 2

Author Comment

by:pixalax
My pleasure, Ray. I wish I could give more points for your quick responses and help.
