I am having problems with my import timing out and have a couple of questions.
I import all my feeds on my local server, make any and all changes, and then upload the revised feeds to the feeds folder in PT.
First question:
To avoid the timeout problem, can I dump my local database and then upload that SQL file directly into the database on the server?
If so, can I run multiple folders off of one table by specifying the same $config_databaseTablePrefix in the config files? That would get me down to a couple of uploads - certainly fewer than uploading all the feeds.
If this can all be done, how would it affect the program with regard to the import, admin and setup processes?
I am trying to avoid a dedicated server, since that seems to be the only option left - at least until I can afford it.
Thanks,
Al
That was why I was wondering about the subfolders using the same table - it would keep my uploads to a bare minimum. I'll let you know what happens.
I'll gladly take the slow import tool.
Thanks for the awesome support.
Al
Hi David,
Let me know when the slow import tool is ready also. When I import the updated feeds at night, my CPU usage goes way up. My host has moved me to a high-resource server until I resolve the issue. I'm on a semi-dedicated server now, but they want me to move to a dedicated server, and I can't afford that right now.
How is the slow import tool coming? I just moved my site to a new host and am having server load issues already. Is there a way that import.php could compare the new datafeed with what's in the database and only delete and import the changes? My hosting company says this may cause less stress to the server than deleting and importing large amounts of data.
Hi Adrian,
Apologies for the delay over the slow import tool. Something, however, that other users have found to reduce server load is to add a sleep() call to the import process every n products. To do this, look for the following line of code at the end of the import record handler function in includes/admin.php (line 297 - original distribution, 344 latest distribution):
return ($admin_importProductCount == $admin_importLimit);
...and replace this with:
if (!($admin_importProductCount % 100))
{
  sleep(1); // pause for 1 second after every 100 products
}
return ($admin_importProductCount == $admin_importLimit);
The numbers can be adjusted to suit, but I'd start with this and see if it's acceptable...
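For example, to pause for 2 seconds after every 50 products instead (values purely illustrative - pick whatever your host is comfortable with), the same block would become:
if (!($admin_importProductCount % 50))
{
  sleep(2); // longer pause, more often - slower import but lower peak load
}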
Cheers,
David.
OK, I have that already. I changed it to sleep(10), but maybe I should reduce the number of products imported before it sleeps. Think that will help?
Hi,
If it was only 1 second, then yes, increasing it should certainly help - and your host should be able to confirm that nowhere near the same level of resources are being consumed...
Cheers,
David.
I changed it to sleep(10) every 100 products and it still causes very high load on the server. I'll try sleep(10) every 20 products and see what happens. My host doesn't want me to run the script without contacting them first - that's how bad of a problem it's causing.
Actually, I have sleep(1) in import.php; I still have to add it to admin.php.
I have something a little different. Do I replace all of this:
if ($admin_importCallback)
{
  if (!($admin_importProductCount % 100))
  {
    $admin_importCallback($admin_importProductCount);
  }
}
return ($admin_importProductCount == $admin_importLimit);
}
with this:
if (!($admin_importProductCount % 100))
{
  sleep(1);
}
return ($admin_importProductCount == $admin_importLimit);
Hi Adrian,
It's actually only code being added; nothing existing needs to be removed.
All you need to do is to add this code:
if (!($admin_importProductCount % 100))
{
  sleep(1);
}
...right at the end of the function, so insert it on the line before this:
return ($admin_importProductCount == $admin_importLimit);
Cheers,
David.
Thanks. I tested it out using one of my largest datafeeds (600,000 products). The server load was really high for about 3 or 4 minutes; then, once I saw products being imported, it went down. So I think the deleting process is causing the high load.
Hi Adrian,
It should be possible to reduce the server load during the deleting process in a similar way; however, if this part of the process is taking up to 5 minutes for 600K products, the optimal count and sleep values will probably need some experimentation in order to come up with something practical.
Anyway, to try this: the deleting process is currently performed by the following code, starting at line 419 of includes/admin.php:
$sql = "DELETE FROM `".$config_databaseTablePrefix."products` WHERE merchant='".database_safe($admin_importFeed["merchant"])."'";
database_queryModify($sql,$insertId);
To insert a sleep of 1 second every 50K products, try replacing the above code with the following:
do
{
  sleep(1); // pause for 1 second between delete batches
  $sql = "DELETE FROM `".$config_databaseTablePrefix."products` WHERE merchant='".database_safe($admin_importFeed["merchant"])."' LIMIT 50000";
} while(database_queryModify($sql,$insertId)); // repeat while rows are still being deleted
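As a rough worked example: 600,000 products at 50,000 per batch is 12 DELETE statements, so the sleep(1) calls add only about 12 seconds in total. Lowering the LIMIT value means more, smaller batches - and more pauses - which spreads the load out further at the cost of a longer overall run.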
Hope this helps,
Cheers,
David.
Hello Al,
Importing locally and uploading an SQL file to your server is certainly an option; however, it is very likely that the same resource restrictions will apply, and whatever method you use to inject the SQL into the online version will also time out - but it's certainly worth trying.
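As a sketch of how that might look from the command line, assuming you have shell access at both ends (database names, credentials and the table name are placeholders - adjust to match your own setup and $config_databaseTablePrefix):
mysqldump -u user -p local_db pt_products > products.sql
mysql -h dbhost -u user -p live_db < products.sql
Running the load over SSH rather than through a web-based tool such as phpMyAdmin avoids PHP's script timeout entirely.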
There's no problem at all with multiple installations (folders) accessing the same database - that can work fine - just use identical database configuration values in each installation.
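For example, each folder's config.php would contain the same database settings - something like the following (values are placeholders, and the variable names should be checked against your own config.php):
$config_databaseServer = "localhost"; // same server...
$config_databaseUsername = "dbuser"; // ...same credentials...
$config_databasePassword = "dbpass";
$config_databaseName = "pricetapestry"; // ...same database...
$config_databaseTablePrefix = ""; // ...and same prefix, so every folder shares one products table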
I'll certainly let you know as soon as the slow import tool is ready, as this should enable a full import to work on even the most basic hosting account...
Cheers,
David.