limited: I have no terminal access and the manager of my domain won't give me the technical data, so I don't know how much RAM my server has.
But my server crashes because of my latest project. Technical support says "resources are not enough", but I don't understand what the problem is or how I could fix it. Maybe it depends on the RAM?
I am trying to run 250 PHP files at the same time, but I am getting these errors:
[error] (11)Resource temporarily unavailable: [client xxx.xxx.xxx.210] - www.website.com - couldn't create child process:
[error] (11)Resource temporarily unavailable: [client xxx.xxx.xxx.210] - www.website.com - AH01223: couldn't spawn child process:
The execution of each file takes just under 10 minutes. In the script there is a for loop with 270 iterations.
At each iteration I make a cURL request to the API of a website, going through a proxy list. Each request takes about 2-3 seconds and I need to extract a very light JSON (about 10 KB).
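For illustration, one run roughly like the one described might look like the sketch below. The API URL, the proxies.txt file and the response handling are placeholders, not the real project code:

```php
<?php
// Minimal sketch of the described loop: 270 iterations, each one a cURL
// request through a rotating proxy, returning a small JSON payload.
// $apiUrl and proxies.txt are assumptions for the example.
$apiUrl  = 'https://api.example.com/data';
$proxies = file('proxies.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

for ($i = 0; $i < 270; $i++) {
    $ch = curl_init($apiUrl);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 10,
        // rotate through the proxy list to avoid blocks
        CURLOPT_PROXY          => $proxies[$i % count($proxies)],
    ]);
    $body = curl_exec($ch);
    curl_close($ch);

    if ($body !== false) {
        $data = json_decode($body, true);   // ~10 KB JSON
        // ... store $data (e.g. in the database) ...
    }
}
```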
Is there a way to figure out how much RAM each run consumes? Unfortunately I don't understand anything about servers, so I don't know how to solve this or what to look for (which technical characteristics: RAM, CPU, etc.).
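One way to get a per-run figure without terminal access is to have each script log its own peak memory when it finishes. A minimal sketch (the log file name is an example, not part of the project):

```php
<?php
// Log the peak memory of this run when the script ends.
// register_shutdown_function also fires after fatal errors,
// so the log entry is written even if the run dies early.
register_shutdown_function(function () {
    $peakMb = memory_get_peak_usage(true) / 1048576;
    file_put_contents(
        __DIR__ . '/memory.log',
        date('c') . ' peak=' . round($peakMb, 2) . " MB\n",
        FILE_APPEND
    );
});

// ... normal work of the script ...
```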
Unfortunately I cannot modify my project; in short, I must have 258 files that are executed simultaneously.
Can someone help me?
sounds like your hosting provider sucks
Yes, sure. Is there a Telegram group where I can ask for help and pay a programmer to solve my problem?
So one file takes 10 minutes to execute? Do I understand that right?
In practice I have to make requests to the API of a site (there are 258 different links). Each link/request returns a different JSON. This JSON can change all the time, so the more requests I make the better it is for me.

To automatically execute each script (I have 258, all identical, where only the link in the cURL request varies) I use cronjobs (a paid service). Currently, based on how much I pay, I have about 40 thousand cronjobs per day, so in order to make as many requests as possible I run the 258 scripts every 10 minutes (144 * 258 = 37,152 cronjobs/executions per day).

In each file there is a for loop with 270 iterations. At each iteration I make a cURL request to the website's API (to avoid blocks I use a proxy list). If it helps, yesterday I used memory_get_usage and each script reported about 0.9 MB.

The strange thing is that for about 20 days everything worked perfectly. The problem is not the code; in fact, if I run a script individually it works fine. I wonder: could it be that the server has a limit on concurrent executions? And if so, why did it work for 20 days? Also, could it be that the database (in which I save some data) is too full, and that is why these errors (500 status code) have now appeared?
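The "couldn't spawn child process" errors are often about hitting a process limit rather than RAM alone, so it may be worth checking how many scripts really overlap at any moment. Without shell access, one rough approach is to have every run append a start and end timestamp to a shared log and count the overlaps afterwards. A sketch, with the log file name as an assumption:

```php
<?php
// Record when this run starts and ends, so the number of
// simultaneously running scripts can be counted from the log later.
$log = __DIR__ . '/concurrency.log';

file_put_contents($log, getmypid() . ' start ' . microtime(true) . "\n", FILE_APPEND | LOCK_EX);

register_shutdown_function(function () use ($log) {
    file_put_contents($log, getmypid() . ' end ' . microtime(true) . "\n", FILE_APPEND | LOCK_EX);
});

// ... normal work of the script ...
```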