

What if I have a Commission Junction feed with multiple merchants?

Submitted by robertuva on Thu, 2006-10-05 18:52

How do I separate those out so that it doesn't throw them all in as the same merchant?

Also, do you have the capability for subcategories?

Submitted by support on Thu, 2006-10-05 19:18

Hi!

See the following thread for some solutions to the CJ multiple-merchant issue:

http://www.pricetapestry.com/node/167

The easiest solution is the "Drop Record If NOT" filter - simply copy the file multiple times (one copy per merchant) and then register / import each copy using a filter to include only one merchant per copy.
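For example, on a Linux host the copies could be made with something like this (just a sketch - the feed filename and merchant names are placeholders for your own):

cp /path/to/pt/feeds/cj.xml /path/to/pt/feeds/cj-merchant1.xml
cp /path/to/pt/feeds/cj.xml /path/to/pt/feeds/cj-merchant2.xml

...and then each copy is registered with a "Drop Record If NOT" filter on the merchant name field, matching merchant1 or merchant2 respectively.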

I'm afraid Price Tapestry does not support sub-categories. This is simply because very few affiliate product feeds provide good quality category data, let alone sub-categories.

If you search the forum (search box here - use the site search option) for Commission Junction you will find various threads regarding using CJ feeds with Price Tapestry - including how to separate out the category value into a single category for import.

Hope this helps!
Cheers,
David.

Submitted by stevlam on Fri, 2006-11-17 15:14

Hi David,

The CJ feeds are causing me a headache!

I receive a zip file containing all the feeds.

I access the feed via FTP from CJ's FTP site.

Is there a way I can automate this and access the file, copy it from the CJ ftp folder and then extract the contents to the feeds folder on my server?

Thanks for any help.

Submitted by support on Fri, 2006-11-17 15:39

Hi Steve,

If you are running Linux, and have the ability to upload a shell script to your account which you can set up to run via cron, then yes, you should be able to automate this.

The majority of what you'll need to do is covered in the Automation Guide:

http://www.pricetapestry.com/node/198

Notice that the feeds themselves are retrieved using the wget command. Wget supports FTP just as well as HTTP, so you could use a command similar to the following to download your CJ feeds (and then unzip them):

/path/to/wget -O "/path/to/pt/feeds/cjfeeds.zip" "ftp://username:password@www.example.com/path/to/feeds.zip"
/path/to/unzip /path/to/pt/feeds/cjfeeds.zip -d /path/to/pt/feeds/

(where username and password are your CJ FTP username and password - make sure the wget command is all on one line, even if it wraps on screen)
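Put together as a single script that cron can run, it might look something like this (a sketch only - all the paths are placeholders to adapt to your server):

#!/bin/sh
# fetch_cj.sh - download the CJ zip over FTP and unpack it into the feeds folder
/path/to/wget -O "/path/to/pt/feeds/cjfeeds.zip" "ftp://username:password@www.example.com/path/to/feeds.zip"
/path/to/unzip -o /path/to/pt/feeds/cjfeeds.zip -d /path/to/pt/feeds/

...with a crontab entry to run it at, say, 2AM each day:

0 2 * * * /bin/sh /path/to/fetch_cj.sh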

Hope this helps,
Cheers,
David.

Submitted by bwhelan1 on Sat, 2007-03-24 05:09

David,

CJ states that they will provide their feeds one of two ways:

One zip file containing one text file with all merchants

OR

One zip file containing separate text files for all merchants.

Is there any way you know of to automate the extraction of a single text file from the zip file? I'm using a Linux web host.

Bill Whelan

Computer Store

Submitted by support on Sat, 2007-03-24 07:59

Hi Bill,

If the filenames are encoded correctly, the multi-file zip is the one you want to use. However, in your automation script, instead of the current method (taken from your post in the automation thread):

/bin/gunzip -c /home/xxxxxxxx/public_html/feeds/10851.zip > /home/xxxxxxxx/public_html/feeds/10851.txt

...it would need to look something like this:

/bin/gunzip /home/xxxxxxxx/public_html/feeds/10851.zip

In other words, no longer using the -c parameter - just letting the files extract as stored. Hopefully the file names are sensible, but there is a small chance that they will extract into a new directory, so look out for that happening...
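One caveat: gunzip may refuse a zip archive that contains more than one member, in which case unzip is the tool to use instead. Its -j switch junks any directory paths stored in the archive, which also avoids the files extracting into a new directory. A sketch, using the same placeholder paths as above:

/usr/bin/unzip -j -o /home/xxxxxxxx/public_html/feeds/10851.zip -d /home/xxxxxxxx/public_html/feeds/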

Cheers,
David.

Submitted by bwhelan1 on Sat, 2007-03-24 18:29

The problem I have is that the zip file contains separate text files for ALL of the various merchants, it resides on the Commission Junction FTP site and it's quite large.

I don't want to have to download the whole file to every site I have using your script, and then have to script the deletion of every file contained in that zip for every site. If that were the case, I would have to change fetch.sh every time a merchant was added - not to mention the wasted bandwidth, and the disk space needed on each site just to extract the file once a month when the import is run.

Perhaps I could set up one of my sites to download the CJ zip, then extract each merchant file as a separate text file using the existing name of each file? Any ideas on how to do that?

And if that works, since all the sites are on the same box, would I be able to transfer the files between the home directories of each domain with the script, so I don't have to use FTP and consume bandwidth? There is a separate ID/password for each account.

Bill Whelan

Computer Store

Submitted by support on Sat, 2007-03-24 18:55

Hi Bill,

Perhaps one option is to create a master download directory on one of your sites (it doesn't really matter which one) and schedule a standalone script to download and unzip the CJ file into that directory.

Then you can set up your automation scripts to "download" the files directly from your master download directory. Because they are on the same server you won't consume any bandwidth, even though you will be using a wget command.

Your master download script would look something like this (assuming a directory called cjfeeds on site1.com):

/usr/bin/wget -O "/home/site1.com/public_html/cjfeeds/cj.zip" "ftp://username:password@www.example.com/cj.zip"
/usr/bin/unzip -o /home/site1.com/public_html/cjfeeds/cj.zip -d /home/site1.com/public_html/cjfeeds/

Now, assuming that you schedule the above for, say, 2AM, then on your other sites you schedule scripts similar to the following for 3AM:

/usr/bin/wget -O "/home/site2.com/public_html/feeds/merchant1.xml" "http://www.site1.com/cjfeeds/merchant1.xml"

(where merchant1.xml is one of the files extracted from cj.zip in the master download directory)
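The scheduling itself is just two crontab entries, something like the following (the times, and the script names, are placeholders for your own):

# on site1.com - master download and unzip at 2AM
0 2 * * * /bin/sh /home/site1.com/scripts/cjmaster.sh
# on each other site - fetch from the master directory at 3AM
0 3 * * * /bin/sh /home/site2.com/scripts/fetch.sh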

Hope this helps!
Cheers,
David.

Submitted by bwhelan1 on Sat, 2007-03-24 20:06

Thanks again David,

I'll give it a shot.

Bill Whelan

Computer Store

Submitted by bwhelan1 on Mon, 2007-03-26 14:29

David,

For anyone who may have the same question I did about bandwidth, below is the response I got from my hosting company:

Transferring Files Between Accounts
Hi Bill,
Thanks for your patience on this matter. I am afraid that you would still utilize bandwidth on the account from which you are downloading the file. This is because wget will be downloading via an HTTP link, which cPanel will log as bandwidth. The one advantage to using wget vs FTP is that it will only consume bandwidth from the account you are downloading from, and not from the account that is calling wget and receiving the file.
Regards,
Mark
WebHSP Support

Bill Whelan

Computer Store

Submitted by crounauer on Mon, 2007-10-01 15:36

Hi,

This is an additional comment on retrieving CJ feeds via FTP.

The problem I encountered was that it was made available once a week and the filename would always change, i.e. it would end in the date it was generated on...

200###_#####_20071001.zip

To resolve this I created a shell script as above, but added a line that automatically inserts the date, so that the filename is always correct. This is the shell script:

#!/bin/sh
#########
TODAY=$(date +"%Y%m%d")
#########
/usr/bin/wget -O "/var/www/####/####/cjfeeds.zip" "ftp://username:password@####.###.com/####/###/####/200###_#####_$TODAY.zip"
/usr/bin/unzip /var/www/####/####/cjfeeds.zip -d /var/##########/datafeeds/
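Since the file is only generated once a week, the matching cron entry runs on the generation day - for example (assuming, for the sake of the sketch, that the feed is generated on Mondays; the script path is a placeholder too):

# run the fetch script at 6AM every Monday
0 6 * * 1 /bin/sh /var/www/####/scripts/cjfetch.sh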

Hope this helps someone!

Computer Hardware

Submitted by support on Mon, 2007-10-01 15:39

Thanks for the tip, Simon!

Submitted by clare on Tue, 2007-10-02 19:44

I wrote to CJ recently asking how to download individual merchant datafeeds, and got their reply today, which may be relevant in this thread:

If you would like to set up each data feed individually we will have to charge you a $200 one-time fee. But we are currently working on a new feature for publishers where you will be able to download the required data feeds through the CJ interface yourself. You should see an announcement about it in your CJ account shortly.

So I don't know how soon "shortly" is, but it sounds like they are working to make these individual feeds available.