

Sharing Feeds on multiple website via Automation and Filter

Submitted by babyuniverse on Mon, 2008-12-15 05:06

Hi All,

I have several websites that all share the same data feeds. For instance, I download a data feed and upload it to website 1, then download it again and upload it to website 2, and so on. The reason for this doubling up is that the data feed contains an affiliate ID which is unique to one URL.

Example Feeds
Website 1 - http://www.example.com/Sale.aspx?BID=81343&AfID=WEBSITE1&AdID=9240
Website 2 - http://www.example.com/Sale.aspx?BID=81343&AfID=WEBSITE2&AdID=9240
Website 3 - http://www.example.com/Sale.aspx?BID=81343&AfID=WEBSITE3&AdID=9240
Website 4 - http://www.example.com/Sale.aspx?BID=81343&AfID=WEBSITE4&AdID=9240
Website 5 - http://www.example.com/Sale.aspx?BID=81343&AfID=WEBSITE5&AdID=9240

Current Process
1 - Download the feed to my local computer for each website
2 - Upload the feed to each website
3 - Import manually into each website

To save time and avoid duplication I was thinking of doing the following (clixGalore does not have a direct link to the feed):

New Process
1 - Download one copy of the feed onto my local computer
2 - Upload the file to a standard website
3 - Use an automation script to fetch the feed from the standard website
4 - Auto-import and use a filter (Find and Replace) to change the AfID to the appropriate website during import (see the sketch below)
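For illustration, the kind of find and replace I mean would be something like this on the raw feed file (the filenames are made up, and in practice I'm assuming the Search and Replace filter would do the equivalent during import):

# rewrite the affiliate ID in a copy of the standard feed for a second site
sed 's/AfID=WEBSITE1/AfID=WEBSITE2/g' 813438.csv > 813438-website2.csv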

Questions
Is this the best option to save time? Is there a better way to save even more time?
Which script would I use for automation: http://www.pricetapestry.com/node/198 or http://www.pricetapestry.com/node/24?

I am assuming that by uploading the feeds to a standard website I can then access them directly from the script at, say, http://www.website.com.au/globalfeeds/813438.csv

Would appreciate any feedback or suggestions; I currently have 13 different websites all sharing all or part of the same feed.

Regards
Richard

Submitted by support on Mon, 2008-12-15 09:23

Hi Richard,

The instructions in http://www.pricetapestry.com/node/198 are the main guide for setting up automation, so start there. However, if you find that you don't have the necessary access to your server, you can look at using the PHP download script in the other thread.

The method you describe sounds reasonable (downloading a standard feed, and then using Search and Replace on the Buy URL to change the AfID). The only caveat is that if each of your main websites is scheduled to run the script at a specific time and download the standard feed from your standard website, you would have to make sure you had downloaded and uploaded all the feeds manually before the CRON jobs were due to execute. That may not be a problem of course - schedule them all during the night, and make sure the downloading / uploading is done during the day.
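For example, a crontab entry along these lines on each site (the path to the automation script here is just a placeholder) would run the fetch at 3am, leaving the day free for the manual download / upload:

# run the automation script at 3am every day (add via crontab -e)
0 3 * * * /bin/sh /home/website1/fetch.sh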

However, if each of the websites is on a different server, and it is therefore not an option to have one script that downloads the feed once and then copies it to the multiple installations, it may be just as easy to set up automation independently on each server, using the correct AfID for that server.

If you go down this route and use the shell script solution described in the main automation guide, you could use a variable so that you don't have to keep editing the AfID - you can just set it once at the top of the script, for example:

AfID=WEBSITE1
wget -O /path/to/feeds/merchant1.csv "http://www.example.com/Sale.aspx?BID=81343&AfID=$AfID&AdID=9240"
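(If the same script fetches several merchant feeds, the variable then only needs setting in one place per site - the extra merchant name and BID value below are made-up examples.)

AfID=WEBSITE1
wget -O /path/to/feeds/merchant1.csv "http://www.example.com/Sale.aspx?BID=81343&AfID=$AfID&AdID=9240"
wget -O /path/to/feeds/merchant2.csv "http://www.example.com/Sale.aspx?BID=81344&AfID=$AfID&AdID=9240"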

Hope this helps!

Cheers,
David.

Submitted by babyuniverse on Wed, 2008-12-17 11:01

Hi David,

This is working fine using my fetch script as follows. I have downloaded the feeds to mydomainname.com.au/allfeeds (this is a generic website to store the feeds and distribute them to all the other sites, allowing me to only download once),
and I have set up fetch.sh and a filter on website1.com.au to change the AfID (find and replace) to the appropriate id for website1.
I intend to copy fetch.sh to all my other websites, changing only the path and the filter.

#!/bin/sh
#########
# FETCH #
#########
/usr/bin/wget -O "/home/website1/public_html/feeds/aboutstyle.csv" "http://www.mydomainname.com.au/allfeeds/aboutstyle.csv"
/usr/bin/wget -O "/home/website1/public_html/feeds/alwaysonsale.csv" "http://www.mydomainname.com.au/allfeeds/alwaysonsale.csv"
##########
# IMPORT #
##########
/usr/bin/php /home/website1/public_html/scripts/import.php @MODIFIED

Is this the ideal way? You mentioned above using:

AfID=WEBSITE1
wget -O /path/to/feeds/merchant1.csv "http://www.example.com/Sale.aspx?BID=81343&AfID=$AfID&AdID=9240"

I'm not sure how I can utilise this, as the feed names don't have the AfID in them and are basically just name.csv.
Can I change import.php on each site to automatically change the AfID to the appropriate id?

So far you have saved me a heap of time

Thanks Richard

Submitted by support on Wed, 2008-12-17 12:12

Hi Richard,

The method I was referring to would only be applicable if, instead of downloading to your common site, you decided to download the original feed (with a different AfID) on each site independently - that's all!

If you find it's not too much trouble managing the filters etc., there's nothing wrong with how you are doing it at all, so it's probably easiest to stick with that...

Cheers,
David.

Submitted by sbedigital on Sun, 2008-12-21 14:50

If all of your sites are on one server, why not run all of them using a single database? I do it with my 11 different sites and they all have their own purposes. All you need to do is keep the database information the same for all of your sites in config.php and modify the site-specific details on each site, for example:

<?php
  $config_title = "example1.com";
  $config_charset = "utf-8";
  $config_baseHREF = "/";
  $config_useRewrite = false;
  $config_useRelated = true;
  $config_useTracking = true;
  $config_useJavaScript = true;
  $config_useInteraction = true;
  $config_currencyHTML = "&pound;";
  $config_resultsPerPage = 20;
  $config_databaseServer = "localhost";
  $config_databaseUsername = "db_user1";
  $config_databasePassword = "db_pass";
  $config_databaseName = "db_name";
  $config_databaseTablePrefix = "";
?>

<?php
  $config_title = "example2.com";
  $config_charset = "utf-8";
  $config_baseHREF = "/compare/";
  $config_useRewrite = false;
  $config_useRelated = true;
  $config_useTracking = true;
  $config_useJavaScript = false;
  $config_useInteraction = true;
  $config_currencyHTML = "&pound;";
  $config_resultsPerPage = 10;
  $config_databaseServer = "localhost";
  $config_databaseUsername = "db_user1";
  $config_databasePassword = "db_pass";
  $config_databaseName = "db_name";
  $config_databaseTablePrefix = "";
?>

As you can see, both sites use the same database but have different configuration settings: example2.com is in the /compare/ directory and shows 10 products in the listings, whereas example1.com is in the base directory of the server, has JavaScript enabled, and shows 20 listings.

When it comes to updating, you can just upload your feed on any of the hosts and register the products in the normal manner. I would also create a backup database on the server, purely because when I am updating a feed I point the sites at the backup database, and once the update is finished I switch back to the proper one. That way your sites are always up and you have an up-to-date backup at all times.

Submitted by babyuniverse on Tue, 2009-01-06 10:04

Hi Everyone,

I have managed to get the auto-updates working as required. One question: can I alter fetch.sh to look at the file size and only copy and import if it has changed? I know @MODIFIED is doing that for the import part, but what about the copy?

#!/bin/sh
#########
# FETCH #
#########
/usr/bin/wget -O "/home/*****/public_html/feeds/abiandjoseph.csv" "http://www.domain.com.au/allfeeds/feed1.csv"
/usr/bin/wget -O "/home/*****/public_html/feeds/aboutstyle.csv" "http://www.domain.com.au/allfeeds/feed2.csv"
##########
# IMPORT #
##########
/usr/bin/php /home/*****/public_html/scripts/import.php @MODIFIED

I am using the code above to fetch 50 files and import the changed files, and I have found I am using a lot of bandwidth because I am constantly fetching files that haven't changed. (I also mistakenly set cron to run fetch every 5 minutes :) so that didn't help.)

Any idea on using WGET to only look for file differences?

Thanks for your help. I will definitely look at using the same database; the one issue I see with that is that each site has a different affiliate number, which I am currently changing using Search and Replace during import.

Submitted by support on Tue, 2009-01-06 11:51

Hi Richard,

Can you post an example of the part of your script that does the copying? It should be possible to do something by copying the file to a temporary file, then comparing the size after the download...
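As a rough sketch of the temporary file idea (the paths are just placeholders, and note that this still uses the download bandwidth - it only avoids touching the live feed file when nothing has changed, so import.php @MODIFIED would skip it):

#!/bin/sh
# download to a temporary file first
/usr/bin/wget -O "/home/website1/public_html/feeds/aboutstyle.tmp" "http://www.mydomainname.com.au/allfeeds/aboutstyle.csv"
# compare sizes and only replace the live feed if it has changed
OLDSIZE=`wc -c < "/home/website1/public_html/feeds/aboutstyle.csv"`
NEWSIZE=`wc -c < "/home/website1/public_html/feeds/aboutstyle.tmp"`
if [ "$OLDSIZE" != "$NEWSIZE" ]; then
  mv "/home/website1/public_html/feeds/aboutstyle.tmp" "/home/website1/public_html/feeds/aboutstyle.csv"
else
  rm "/home/website1/public_html/feeds/aboutstyle.tmp"
fi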

Cheers,
David.

Submitted by babyuniverse on Wed, 2009-01-07 10:41

Hi David,

My script is posted above; basically I am using the fetch.sh script to fetch it from my other website (the copy).

I upload it to 1 website and then use fetch.sh on 6 other websites to basically fetch or copy the feed each night.

Some feeds change daily, others only weekly; however, the fetch.sh script doesn't do any check before fetching.

Hope this makes more sense

Richard

Submitted by support on Wed, 2009-01-07 12:57

Hi Richard,

Ah - that makes sense. That's not so straightforward I'm afraid, as there is no easy way to keep track of the file size on the remote server.

Could I make a suggestion: it is possible that if just a price were to change, the actual size of the file may be identical whilst containing more up-to-date data - so perhaps it is worth always copying the latest version even if it is the same size?

Cheers,
David.