Need an alternative to curl?

blaaze
hi

I have the following setup:

>> remote.com/page.php — accepts 4 parameters and performs some work when called via curl.

>> current.com/main.php — checks for updates on datasource.com; when an update is found, it extracts VALUES from dump.html and calls remote.com/page.php?a=adsf&b=da&c=654&d=erwerzz. This calling is done in a loop: for each set of values from dump.html, it calls remote.com/page.php with the respective parameters, using curl.

>>dump.html
a        b      c      d
--------------------------------
asdf     da     654    erwerzz
qewr     gd     789    werqwe
adfa     er     321    joijoiio
--------------------------------

>> datasource.com, which updates rarely

>> we enabled a cron job to run current.com/main.php, which checks for updates on datasource.com; if it finds updates, it continues with the curl process
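For reference, a minimal sketch of what main.php's per-row curl call might look like. The URL and the parameter names (a, b, c, d) come from the setup above; the helper function names, the timeout value, and the row-parsing of dump.html are my own assumptions:

```php
<?php
// Sketch of the per-row call in main.php (function names are illustrative).

function build_page_url(array $row): string {
    // http_build_query() urlencodes every value, so odd characters are safe.
    return 'http://remote.com/page.php?' . http_build_query($row);
}

function call_page(array $row) {
    $ch = curl_init(build_page_url($row));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // cap each call so one slow row can't hang the cron run
    $response = curl_exec($ch);                     // false on failure
    curl_close($ch);
    return $response;
}

// One row from dump.html produces a URL of the same shape as in the question:
echo build_page_url(['a' => 'asdf', 'b' => 'da', 'c' => 654, 'd' => 'erwerzz']);
// → http://remote.com/page.php?a=asdf&b=da&c=654&d=erwerzz
```

main.php would then loop over the rows parsed from dump.html and call `call_page($row)` once per row.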


QUESTIONS:
1. Are bulk curl calls allowed, and are they acceptable on shared hosting?
2. What will be the impact on the bandwidth of the hosting at current.com?
3. Is there any good host that supports unlimited curl with no time limit per curl call?
4. Is there any alternative to curl to carry out this process successfully?


My question may look somewhat noob, but I need answers. Please share answers; don't post any links.

THANKS IN ADVANCE
Commented:
curl seems OK for REST queries... but consider using memcache instead of the cron job and the HTML scraping.

In main.php, check whether any memcached data is available and send it; otherwise retrieve the data, store it in memcache, and send it.
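That cache-aside pattern can be sketched as below. A real `Memcached` instance would be passed in production; the key name, TTL, URL, and helper name are illustrative assumptions:

```php
<?php
// Cache-aside sketch: serve from cache when possible, fetch and store on a miss.
// $cache is anything with get()/set(), e.g. a Memcached instance.
function cached_fetch($cache, string $key, int $ttl, callable $fetch) {
    $data = $cache->get($key);
    if ($data === false) {              // miss: fetch once, then keep it for $ttl seconds
        $data = $fetch();
        $cache->set($key, $data, $ttl);
    }
    return $data;
}

// In main.php (assuming the Memcached extension and a local memcached server):
//   $mc = new Memcached();
//   $mc->addServer('127.0.0.1', 11211);
//   echo cached_fetch($mc, 'datasource_dump', 3600,
//       function () { return file_get_contents('http://datasource.com/dump.html'); });
```

Since datasource.com updates rarely, repeated requests would then hit the cache instead of re-downloading the same data.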



Author

Commented:
I have a huge list in my dump.html. Does it take huge bandwidth, or are there any issues that could lead to account suspension?

Author

Commented:
@flob9 Can you tell me why you suggested memcache?

Most Valuable Expert 2011
Top Expert 2016
Commented:
I may not understand all of the question, but I might be able to answer part of it...

"Is there any good host which supports unlimited curl with no time limit for each curl call?" Yes, absolutely.  Get a dedicated machine from Rackspace.  You can control 100% of this.

If you are doing something like this in the URL:
"remote.com/page.php?a=adsf&b=da&c=654&d=erwerzz" - Why not try using file_get_contents() ?
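For a plain GET like that, a minimal file_get_contents() version might look like this (the timeout value and function name are assumptions; allow_url_fopen must be enabled on the host):

```php
<?php
// file_get_contents() alternative to curl for a simple GET request.
function fetch_page(string $url) {
    // A stream context gives a per-request timeout, similar to CURLOPT_TIMEOUT.
    $context = stream_context_create(['http' => ['timeout' => 30]]);
    return file_get_contents($url, false, $context); // false on failure
}

$url = 'http://remote.com/page.php?' . http_build_query(
    ['a' => 'asdf', 'b' => 'da', 'c' => 654, 'd' => 'erwerzz']
);
// $response = fetch_page($url);
```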

"What will be the impact on the bandwidth of the hosting at current.com?"
Depends on how much information you retrieve.  The way the question is worded, we cannot tell.

Maybe if you describe the application needs a little more we will be able to help you find a "best practices" solution.  Please try to give us real-world examples so we are not guessing at what the application is trying to accomplish.  Thanks, ~Ray

Commented:
I suggested memcache because I thought the front server was requesting the same static data several times from another server, thus consuming bandwidth.

Memcache was just an idea to limit the cron overhead.

BTW, if the data is huge, the cron could be better.

Maybe you can stick with the cron but use rsync to limit bandwidth. With compression and rsync's built-in diff algorithm, you could save a lot of bandwidth.

Author

Commented:
@ray
"What will be the impact on the bandwidth of the hosting at current.com?"
I mean: how much of the host's bandwidth does it eat? I am calling curl once for each set of a, b, c, d from dump.html, each curl call transfers around 200KiB of data, and there is a huge list in dump.html. What I am worried about is the memory_limit variable in phpinfo(), which is 32MB; does that stop the execution at some point?

Or could they even suspend my account?

Author

Commented:
I am asking this question because someone told me that one cannot call curl more than 50 times on a page, and that there must be a gap of at least 10 minutes between batches of curl calls, because it takes a lot of bandwidth, and so on.
Commented:
A 32MB memory limit should be OK if each call is around 200KiB. If you use more than 32MB, you will get an error like "Fatal error: Allowed memory size exhausted".

What is your current hosting solution?

Author

Commented:
Hagio host. Can you tell me how to overcome this error?
For example, is there any way to check the current memory usage and continue the next part of the script in another call to work around the problem?
Commented:
Try to clear the variables holding a lot of data by setting them to null, or reuse them.

Use memory_get_usage() to check whether your script's memory consumption keeps growing while looping.
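Put together, the loop could check its own memory as it goes. The 32MB limit is from this thread; the 90% threshold, the empty placeholder rows, and the stop-and-resume idea are assumptions for illustration:

```php
<?php
// Memory check inside the curl loop.
$rows  = [/* rows parsed from dump.html */];
$limit = 32 * 1024 * 1024; // the 32MB memory_limit mentioned above

foreach ($rows as $i => $row) {
    // ... curl call for this row ...

    unset($rows[$i]); // drop rows we are done with so PHP can free them

    if (memory_get_usage() > 0.9 * $limit) {
        // Near memory_limit: stop cleanly now rather than hit
        // "Allowed memory size exhausted" mid-loop; the next cron run
        // can resume from the remaining rows.
        error_log("stopping near memory limit at row $i");
        break;
    }
}
```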

Author

Commented:
OK, and what do you say about this?
Someone told me that one cannot call curl more than 50 times on a page, and that there must be a gap of at least 10 minutes between batches of curl calls, because it takes a lot of bandwidth, and so on.

Commented:
It depends on the host's policies... you seem to have a bandwidth limit (at least 100GB/month); maybe you could ask them about curl and bandwidth usage between servers.

 
