Cron jobs and the cURL PHP module

Hi,
I have a PHP function that uses cURL. When I load the page in my browser, all the code executes as it should. However, when the script is called as a cron job, I get a cron result mail with the following error:

"This SMS API class can not work without CURL PHP module!"

Is the cURL module not loaded when I run it as a cron job?
This is how I call all cron-job pages:

"/usr/bin/php /home/puca/public_html/include/script.php"

(All other scripts work this way; only the cURL one gives an error.)
puca01 Asked:
modulo Commented:
Closed, 125 points refunded.

modulo
Community Support Moderator
Experts Exchange
 
hernst42 Commented:
Do a /usr/bin/php -m and see if the curl module is available.
If not, maybe you can try loading it:

<?php
if (dl('curl.so')) { echo "curl loaded"; } else { echo "not loadable"; }

and watch the output on the command line. If it is not available, you have to recompile the CLI PHP.
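That check can be sketched as a small shell snippet; `/usr/bin/php` is the path from the cron line above and may differ on your host:

```shell
# Ask the CLI binary (the one cron runs) which extensions it has loaded.
# 2>/dev/null keeps the check quiet if the binary is missing entirely.
PHP_BIN=${PHP_BIN:-/usr/bin/php}
if "$PHP_BIN" -m 2>/dev/null | grep -qi '^curl$'; then
    echo "curl loaded in CLI php"
else
    echo "curl missing in CLI php"
fi
```

A mismatch with what phpinfo() shows in the browser confirms that the CLI build and the Apache module were configured differently.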
 
puca01 (Author) Commented:
It is not in the module list.
I tried the code and got this warning:

<b>Warning</b>:  dl(): Unable to load dynamic library '/usr/lib/php/extensions/no-debug-non-zts-20020429/curl.so' - /usr/lib/php/extensions/no-debug-non-zts-20020429/curl.so: cannot open shared object file: No such file or directory in <b>/home/puca/public_html/testcurl.php</b> on line <b>3</b><br />
not loadable

I'm not able to recompile because it is shared hosting. But how come cURL works when I access the script with a browser?
Can't I just call the script another way, or something like that?
 
puca01 (Author) Commented:
Points are now 125 for whoever can solve this.
(And again: I cannot recompile; it needs to be done with the cURL that currently works for browser access.)
 
Yasir_Malik Commented:
Perhaps you can compile the module yourself, store it in your account, and then use the absolute path to the module.  I have no idea if that will work.
 
hernst42 Commented:
No, an absolute path won't work for dl():
file /usr/share/extensions/no-debug-non-zts-20020429//home/xyz/curl.so not found

But you could compile PHP on a Linux box with the modules you need, upload that binary, and then use it for your cron job,
e.g. /home/puca/bin/php

Maybe you can even compile that PHP on the box itself if you have shell access, so all libraries will match.
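A minimal sketch of that build, assuming shell access, build tools, and an unpacked PHP source tree matching the host's version (prefix and paths are illustrative):

```shell
# Build a private PHP CLI with curl support into the home directory.
# Run from inside an unpacked php-x.y.z source tree.
./configure --prefix=/home/puca --enable-cli --with-curl
make
make install
# Then point the crontab entry at the private binary:
#   /usr/bin/env /home/puca/bin/php /home/puca/public_html/include/script.php
```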
 
puca01 (Author) Commented:
I'm not an expert in Linux, so I don't know if I can do that.
Can anybody explain to me why the thing works when I load it in a browser but not with command-line execution?

Or just how I can simulate browser access with a command-line execution?
 
LiquidIce911 Commented:
Try other methods of running PHP cron jobs, for example:

/usr/bin/wget http://yoursite.com/page.php

or

lynx --dump http://yoursite.com/cronjob.php >/dev/null
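As crontab entries, these could look like the following sketch (the schedule and URLs are placeholders; `-q` and `-O /dev/null` keep wget from mailing output and leaving files behind):

```shell
# m h dom mon dow  command
0 * * * * /usr/bin/wget -q -O /dev/null http://yoursite.com/page.php
0 * * * * lynx --dump http://yoursite.com/cronjob.php >/dev/null 2>&1
```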
 
Yasir_Malik Commented:
If /usr/share/extensions/no-debug-non-zts-20020429//home/xyz/curl.so won't work, maybe you can do
dl("../../../../home/xyz/curl.so")

But if you have Lynx, you could do as LiquidIce911 suggested.
 
puca01 (Author) Commented:
@LiquidIce

I got these error messages:
/bin/sh: line 1: /usr/bin/lynx: Permission denied
/bin/sh: line 1: /usr/bin/wget: Permission denied

Is there an easy way to get the lynx binary so I can put it in my own folder with sufficient rights?

@Yasir

That was just a warning when I ran this line of code:
if(dl('curl.so')) { echo "curl loaded";} else {echo "not loadable";}

So I'm not referencing that path myself; the server resolves it on its own.
 
petoskey-001 Commented:
I've run into this before.  I think I can help.

First, run phpinfo() from a browser. Under Configuration / PHP Core, find include_path. Copy the value listed.

Then at the top of your script add this line...

    ini_set("include_path", "whatever/path/phpinfo/showed:maybe/more/than/one"); // on Unix the separator is ":"

When your script runs from the browser, Apache is set up and knows where to find everything. When it's run from the command line, you sometimes find that you're running a whole different version of PHP that may not have been configured the same. If you really want to debug things, compare the output of phpinfo() when run from the browser with its output when run from cron.

See http://php.net/phpinfo for a user comment showing a small script to email phpinfo back to you.
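That comparison can be scripted. In this sketch the two capture files are mocked up; in practice you would fill web-info.txt from the browser's phpinfo() page and cli-info.txt from `/usr/bin/php -r 'phpinfo();'`:

```shell
# Mock captures standing in for the two phpinfo() outputs (assumption:
# the real files would come from the web SAPI and the CLI binary).
printf 'extension_dir => /usr/lib/php/extensions\ncurl\n' > web-info.txt
printf 'extension_dir => /usr/lib/php/extensions\n' > cli-info.txt
# Lines present only in the web capture ("< ...") are what the CLI lacks.
diff web-info.txt cli-info.txt || true
```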

------
As far as making lynx or wget work, if you have read permission, try this...

cp /usr/bin/lynx .
cp /usr/bin/wget .
chmod +x lynx
chmod +x wget

You now have the binaries in your local directory, ready to execute. It may not work; it depends on the file permissions your admin has set up.
 
puca01 (Author) Commented:
I've tried what you said about the include_path, but it doesn't work. I still get the same error:
This SMS API class can not work without CURL PHP module!

I already tried copying the binary, but I do not have permissions for that.

I have found a workaround myself: I call a script that makes an HTTP request to the other page. It works this way, although it's not a professional solution :-)
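That workaround can be sketched as follows; the URL and filenames are hypothetical. The point is that file_get_contents() only needs allow_url_fopen, not the curl extension, so the curl-dependent code ends up running under Apache's PHP, where curl is loaded:

```shell
# Write a tiny trigger script for cron; it just fetches the web version
# of the page, so the real work executes under the web server's PHP.
cat > trigger.php <<'EOF'
<?php
// Plain HTTP GET via PHP streams; no curl extension required here.
echo file_get_contents('http://yoursite.com/include/script.php');
EOF
# Matching crontab entry (path style as in the question):
#   /usr/bin/php /home/puca/public_html/include/trigger.php
cat trigger.php
```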
 
arantius Commented:
As for the reason this is happening: there are two "versions" of PHP. One is compiled into Apache, loads once when Apache starts, and is invoked for each page loaded. The other is /usr/bin/php, which is loaded each time you call it. I have to wonder whether the hosting company knows what they're doing if the two versions are compiled with different modules.
 
Venabili Commented:
PAQ it - the last comment of the asker is a solution.

Venabili
Question has a verified solution.