
Executing large queries in PHP

hey
I am making a script for my website which will let people add their website to over 100 toplists with a click of a button, and I have one concern:
this would slow my website down and use up a lot of CPU power.
So I want to know what's the best way of running large batches of queries? Adding a delay, etc.
How is it done in PHP?
I will be using the cURL functions to send data to topsites and want: 1st submission, a 2-3 second rest, 2nd submission, etc.
So the submission page will look like:

Code:
Submitting to XYZ.....[delay]....Added/Failed
Submitting to ABC.....[delay]...Added/Failed
and so on
Or is there any other way of doing this?
thanks
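The submit-with-rest loop described above can be sketched as follows. This is only an illustration: the toplist names, URLs, and POST field names are made-up placeholders, not any real submission API.

```php
<?php
// Sketch of the sequential submission loop with a rest between requests.
// All URLs and field names below are hypothetical.
function submitToToplist(string $url, array $fields): bool
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($fields),
        CURLOPT_RETURNTRANSFER => true, // don't echo the toplist's response
        CURLOPT_TIMEOUT        => 10,   // don't let one slow toplist hang the run
    ]);
    $response = curl_exec($ch);
    $ok = $response !== false
        && curl_getinfo($ch, CURLINFO_HTTP_CODE) < 400;
    curl_close($ch);
    return $ok;
}

$toplists = [
    'XYZ' => 'https://xyz.example/submit',
    'ABC' => 'https://abc.example/submit',
    // ...the other 100+ lists...
];

foreach ($toplists as $name => $url) {
    echo "Submitting to $name.....";
    $ok = submitToToplist($url, ['site' => 'https://yoursite.example/']);
    echo $ok ? "Added\n" : "Failed\n";
    sleep(2); // the 2-3 second rest between submissions
}
```

Note that `sleep()` only spaces the requests out; the PHP process (and the user's connection) stays open for the whole run, which is what the later replies in this thread address.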
 
•••
The views expressed on this page by users and staff are their own, not those of NamePros.
This is just going to use a bunch of power since you are going to be using curl a ton of times. Try using this code in a new document.

Code:
ob_start();

for ($i = 0; $i < 70; $i++) {
    echo 'printing...<br />';
    ob_flush();
    flush();

    usleep(300000); // pause for 0.3 seconds
}

You will notice it has a little bit of a pause, instead of the script waiting for the entire page to be loaded first. You could run something similar to that (without the printing, and made into a function) after each site that it tries to submit to.
 
•••
Umm, I tried this and it just loads for a long time... and then prints the entire thing at once.
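When flush() appears to do nothing and the page arrives all at once, it is usually because some other layer (PHP's output compression, an already-open output buffer, or the web server's gzip module) is still buffering. A sketch of the settings that commonly need to be turned off first, assuming Apache with mod_php; which of these actually applies depends on the server:

```php
<?php
// Layers that commonly swallow flush(); which ones matter is server-specific.
ini_set('zlib.output_compression', 'Off'); // PHP-level gzip buffers the whole page
ini_set('implicit_flush', '1');            // flush after every output call

if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');         // mod_deflate also buffers output
}

while (ob_get_level() > 0) {
    ob_end_flush();                        // close any buffers already open
}

// Some browsers also wait for an initial chunk before rendering anything:
echo str_repeat(' ', 1024), "\n";
flush();
```

A reverse proxy (nginx, Varnish) in front of Apache can buffer as well, in which case nothing done inside PHP will help.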
 
•••
I'd say load it piecewise using AJAX so it doesn't cut into your initial page load time. Try the Prototype JS framework. Put your functions in a different script and load it there with a "loading data" dialogue; that way the processing is controlled from the client side. Another way is to put it in a backend queue and process it when HTTP clients are below a certain threshold.

Hope this works
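On the server side of that AJAX approach, each client call would hit a small PHP endpoint that submits to a single toplist and returns the result, so each toplist costs one short request instead of one long page load. A sketch, with the file name, parameter name, and response strings all assumed:

```php
<?php
// submit_one.php -- hypothetical endpoint called once per toplist via AJAX.
// The client sends ?list=XYZ and gets back a one-word status.

// Made-up registry of toplist submission URLs:
$toplists = [
    'XYZ' => 'https://xyz.example/submit',
    'ABC' => 'https://abc.example/submit',
];

function handleSubmitRequest(array $toplists, ?string $list): string
{
    if ($list === null || !isset($toplists[$list])) {
        return 'Unknown list';
    }
    // the real cURL submission to $toplists[$list] would go here
    return 'Added';
}

echo handleSubmitRequest($toplists, $_GET['list'] ?? null);
```

The browser-side script would then loop over the list names, calling this endpoint one at a time and appending each status line to the page.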
 
•••
I would have the commands placed into a text file or database. Tell the user it is done, but really the submissions are in a queue to be processed. A cron script then runs the commands in the database/flat file at the time period of your choosing. This way you still use a lot of resources, but your user is not delayed and your CPU can be pushed during your slowest time of day instead of instantly.
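The queue idea above can be sketched with a flat file; the file path, the field layout, and the crontab line are assumptions:

```php
<?php
// Flat-file queue sketch for the cron approach described above.

// The web page only appends a line and tells the user "done":
function enqueueSubmission(string $queueFile, string $siteUrl): void
{
    file_put_contents($queueFile, $siteUrl . "\n", FILE_APPEND | LOCK_EX);
}

// A cron job run during the quiet hours drains the queue and does
// the slow per-toplist work (passed in here as a callable):
function drainQueue(string $queueFile, callable $submit): int
{
    if (!is_file($queueFile)) {
        return 0;
    }
    $lines = file($queueFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    file_put_contents($queueFile, ''); // truncate before processing
    $done = 0;
    foreach ($lines as $siteUrl) {
        $submit($siteUrl); // the per-toplist cURL calls go here
        $done++;
    }
    return $done;
}
```

A crontab entry such as `0 4 * * * php /path/to/drain-queue.php` would then push the CPU at 4am instead of while the user waits.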
 
•••
> This is just going to use a bunch of power since you are going to be using curl a ton of times. Try using this code in a new document.
>
> ob_start();
>
> for ($i = 0; $i < 70; $i++) {
>     echo 'printing...<br />';
>     ob_flush();
>     flush();
>
>     usleep(300000);
> }
>
> You will notice it has a little bit of a pause, instead of the script waiting for the entire page to be loaded first. You could run something similar to that (without the printing, and made into a function) after each site that it tries to submit to.

This would cause high memory usage. PHP does not fully clear the memory it is using once it has finished with it, so as the script continues through the loop it builds up more and more memory.

For most people this would not be a problem, as a script usually only stays open for mere milliseconds and has little impact, but with your suggestion the connection could stay open for minutes or even longer.


> I would have the commands placed into a text file or database. Tell the user it is done, but really the submissions are in a queue to be processed. A cron script then runs the commands in the database/flat file at the time period of your choosing. This way you still use a lot of resources, but your user is not delayed and your CPU can be pushed during your slowest time of day instead of instantly.

This is probably the best solution; you would be able to run the queue at periods when the site is less busy.
 
•••
I would run one query at a time. Have the script reload the page, picking up where the last query left off. This way the page is not sitting waiting to load the whole thing. Similar to what PHP scripts that import large .sql files do.
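The reload-and-resume pattern described above can be sketched like this: handle a small batch per request, then redirect to the same page with the next offset, the way large .sql importers do. The batch size and list contents are placeholders:

```php
<?php
// Reload-and-resume sketch: a few submissions per page load, then redirect.
const BATCH = 5; // assumed tuning knob: submissions handled per request

function nextOffset(int $offset, int $total, int $batch = BATCH): ?int
{
    $next = $offset + $batch;
    return $next < $total ? $next : null; // null means every item is done
}

$toplists = ['XYZ', 'ABC' /* ...the rest of the 100+ lists... */];
$offset   = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

foreach (array_slice($toplists, $offset, BATCH) as $name) {
    // the per-toplist cURL submission would go here
    echo "Submitting to $name.....Added/Failed\n";
}

$next = nextOffset($offset, count($toplists));
if ($next !== null) {
    header('Location: ?offset=' . $next); // browser reloads; PHP starts fresh
}
```

Because each request only lives for one small batch, no single PHP process runs long enough to hit the timeout or memory problems discussed earlier in the thread.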
 
•••