Cron jobs at different times for different feeds?

Submitted by stonecold111 on Wed, 2013-08-14 08:33

From what I understand, the cron command on the support page pulls all of the feeds at once at a scheduled time. I only have 3 feeds to pull right now, so it isn't a problem, but what if I have hundreds of feeds in the future? Can I assign different cron jobs to different feeds at different times in order to avoid overload or other issues?

Submitted by support on Wed, 2013-08-14 10:02


Sure, that would be quite easy to set up. (I recently wrote a stepped version of cron.php that processes one feed at a time, which can be scheduled to run, say, every 10 minutes.) Alternatively, if you find the process load is too high during import, what a number of users do is insert a 1-second sleep every 100 products, which greatly reduces load.
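For the per-feed approach, staggered crontab entries might look something like the following. This is only a sketch: the paths are placeholders, and passing a feed filename as an argument to cron.php is an assumption, so check how your version of scripts/cron.php accepts arguments before using it.

```shell
# Hypothetical crontab: stagger feed imports 10 minutes apart
# (assumes cron.php accepts a feed filename argument - verify for your version)
0  2 * * * cd /path/to/scripts && /usr/bin/php cron.php feed1.xml
10 2 * * * cd /path/to/scripts && /usr/bin/php cron.php feed2.xml
20 2 * * * cd /path/to/scripts && /usr/bin/php cron.php feed3.xml
```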

If you want to try this, edit includes/admin.php and look for the following code at line 399 (12/10A), 409 (13/03A), 415 (14/06A) or 535 (15/09A and later):

  return ($admin_importProductCount == $admin_importLimit);

...and REPLACE with:

  if (!($admin_importProductCount % 100)) sleep(1);
  return ($admin_importProductCount == $admin_importLimit);


Submitted by stonecold111 on Wed, 2013-08-14 10:37

Thanks for your suggestions, David.
Given that it takes less than 2-3 seconds to process each of my feeds, and it's unlikely that I'll have more than 50 feeds on my site, I guess I don't really need to worry about the process load at the moment.

Submitted by stevebi on Tue, 2015-04-07 07:57

Thank you very much for your modification David,

I have applied it since my CPU load is very high during import.

Waiting until next cron

Submitted by stevebi on Wed, 2015-04-08 05:09

Hello David,

Just to point out that memory usage decreased to 1.2%, but CPU remained at 46.2%.

Is there any way to reduce the CPU load?

Thank you for your support.



Submitted by support on Wed, 2015-04-08 06:39

Hi Steve,

The other thing besides importing that could increase CPU utilisation is if your site, which I know is very large, was being crawled rapidly. I just checked your robots.txt and noticed there is no Crawl-delay advisory, so I would strongly recommend adding one for a very large site; 2 seconds would ensure that search engine crawlers don't cause any unnecessary load. You can do this by adding the following to robots.txt:

User-agent: *
Crawl-delay: 2


Submitted by Actual on Mon, 2021-01-11 19:36

Apologies for bumping a very old thread. I am having the same problem as the original poster. I tried the change to includes/admin.php suggested above (inserting a 1-second sleep after every 100 products), but it has made no difference.

I've checked my logs and at no point did my server run out of memory or have CPU beyond an acceptable limit. Is there any way I could cron a slow import? In the interim, I have had no choice but to turn off the cron job and run slow import manually (which works just fine).


Submitted by support on Tue, 2021-01-12 08:17

Hi Steve,

The Slow Import mechanism really isn't intended for automation, although it might be possible using a combination of a 302 redirect instead of meta/refresh and wget. What I would suggest trying first is a more significant sleep between each feed (I assume you are experiencing server load issues during cron). To do this, edit scripts/cron.php and look for the import() call at line 93:


...and REPLACE with:


(or as required)
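The code snippets appear to have been lost from this post. A plausible reconstruction of the intended change, based on the surrounding text, would be the following; note that the exact form of the import() call at line 93 is an assumption, so match whatever actually appears in your copy of scripts/cron.php:

```php
// Hypothetical - the existing import() call at line 93 of scripts/cron.php:
import($filename);

// ...REPLACEd with a pause between feeds (10 seconds here, or as required):
import($filename);
sleep(10);
```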


Submitted by MarcoCH on Tue, 2021-01-12 12:25

hello David

For me, the cron job always breaks off after 7-8 minutes, and a new job starts all over again, so I can never update all the feeds.

Is there a server setting that would let the job run longer? Or where could the problem be?

Best regards

Submitted by support on Wed, 2021-01-13 08:06

Hello Marco,

The job starting all over again doesn't sound like normal cron job behaviour; if that's happening, I would suggest contacting your host, as it may indicate a server configuration problem.

It is common, particularly on shared hosting, for a server to terminate a script that is consuming too many resources; however, with the sleep() periods described above, that shouldn't be an issue. I would contact your host, let them know the command line you are using, and advise that it appears to be terminating and restarting every 7-8 minutes; hopefully they will be able to advise...