
Support Forum



Automatic update for AW

Submitted by kajoku on Thu, 2007-08-09 02:40

Hi David, can you please tell me exactly how to use the import.php file? I would like to have my modified AW feeds uploaded automatically. Please assume I don't know anything, because I read the thread but didn't quite understand how to execute the command.

Still learning this stuff :)

Cheers
K

Submitted by support on Thu, 2007-08-09 09:48

Hi,

Have you read through the automation guide?

http://www.pricetapestry.com/node/198

The above page is the most comprehensive guide there is to setting up automation and scheduling import.php, but there is quite a lot to take in. What I would recommend is taking it in steps. First, get the automated downloading working using the commands to schedule calls to wget etc.
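To illustrate that first step, here is a minimal sketch of the scheduled download command. The directory, filename, and URL below are placeholders, not real Affiliate Window details; the command is echoed rather than run so it can be checked before scheduling.

```shell
# Sketch of the scheduled download step. FEEDS_DIR, FILENAME and URL
# are placeholders; substitute your own feed details before scheduling.
FEEDS_DIR="feeds"
FILENAME="merchant.xml"
URL="http://www.example.com/feeds/get.asp?merchant=123"
# Build the wget command and print it so it can be checked before
# it is ever run for real.
CMD="wget -O $FEEDS_DIR/$FILENAME $URL"
echo "$CMD"
```

Once the printed command looks right, dropping the echo and running it directly (or from a shell script) is exactly the piece that cron will later schedule.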

Once that is working, then try using import.php as described in the thread to import all modified feeds, and finally put it all together into an automation script which you then set up to be called by cron.

First things first, however: do you have shell access to your hosting account? Normally, you would do this by logging in over Telnet or, more commonly these days, SSH... If not, what access do you have to your hosting account, and do you know if you are allowed to run CRON jobs?

Cheers,
David.

Submitted by kajoku on Thu, 2007-08-09 11:28

Hi David, thank you very much for your prompt reply. I have access to a control panel where I can set up cron jobs, so yes, I am allowed to run CRON jobs.

What do you suggest that I do now please?

Regards
Ken

Submitted by support on Thu, 2007-08-09 11:46

Hi Ken,

If you have control panel access but not shell access, it is possible that you will need to write a PHP method of automation rather than using a shell script. There are quite a few steps to this, so we'll take it one step at a time - not only is there the PHP required, but we also need to check that PHP has the necessary access to the feeds directory.

Firstly, you need to make the /feeds/ directory of your site writable by PHP. You might be able to do this via your FTP program. Login to your site, and then RIGHT-CLICK on the /feeds/ directory and see if you can find a form or dialog box for setting the permissions (it might be on properties or something like that). When you have found the properties, enable WRITE ACCESS for all users (basically, check all the write boxes).
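For readers who do have shell access, the same permission change can be made from the command line. Here is a minimal sketch using a throwaway directory name so it is safe to try anywhere; on a live site you would target your installation's real /feeds/ path instead.

```shell
# Create a stand-in feeds directory and make it writable by all users.
# On a live site you would run chmod against your real /feeds/ path.
mkdir -p feeds_demo
chmod a+w feeds_demo
# Confirm the directory is writable by the current user.
[ -w feeds_demo ] && echo "feeds_demo is writable"
```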

Next, create the following file in your /admin/ directory to test the access. We'll use the filename "automate.php", which will eventually be what you call via your CRON job.

automate.php

<?php
  $feedsDir = "../feeds/";
  if (is_writable($feedsDir))
  {
    print "PHP HAS WRITE ACCESS!";
  }
  else
  {
    print "PHP DOES NOT HAVE WRITE ACCESS!";
  }
?>

Once this is working, and says "PHP HAS WRITE ACCESS!" we'll build up the script to download and import your feeds automatically. The final step will be to setup a CRON job to call this script automatically....

Cheers,
David.

Submitted by kajoku on Thu, 2007-08-09 16:43

Hi David, yes I have the message "PHP HAS WRITE ACCESS!" Can you tell me the next steps please?

Regards
K

Submitted by support on Thu, 2007-08-09 17:26

Hi Ken,

That's great. Here's the stage 2 test script. This will test two methods of downloading a feed from the Internet to your feeds directory. The first method tries the WGET command, which might be installed on your server, although even if it is, PHP might not have permission to access it. If that test fails, it then tries using PHP's standard fopen() function against the URL. The second test has more chance of working. If that fails too, I'll be able to tell you what you need to ask your host in order to enable it.

Please replace the original automate.php with the new version below, and post what output you get. This will tell us what method to use when automating the downloading of your feeds.

automate.php:

<?php
  header("Content-Type: text/plain");
  $feedsDir = "../feeds/";
  if (!is_writable($feedsDir)) { print "Cannot write to ../feeds/"; exit(); }
  $testURL = "http://newsrss.bbc.co.uk/rss/newsonline_uk_edition/front_page/rss.xml";
  $testFilename = "bbc.xml";
  @unlink($feedsDir.$testFilename);
  $cmd = "wget -O ".$feedsDir.$testFilename." \"".$testURL."\"";
  exec($cmd);
  if (@filesize($feedsDir.$testFilename))
  {
    print "WGET method works!\n";
  }
  else
  {
    print "WGET method failed, now testing FOPEN method...\n";
    $input = fopen($testURL,"r");
    $output = fopen($feedsDir.$testFilename,"w");
    while(!feof($input))
    {
      fwrite($output,fread($input,1024));
    }
    fclose($output);
    fclose($input);
    if (@filesize($feedsDir.$testFilename))
    {
      print "FOPEN method works!\n";
    }
    else
    {
      print "FOPEN method failed.\n";
    }
  }
  print "Done.\n";
?>

Cheers,
David.

Submitted by kajoku on Thu, 2007-08-09 19:57

Hi David, thanks for that. This is the result I got:

"WGET method works!
Done."

Submitted by support on Thu, 2007-08-09 20:22

Hi Ken,

That's excellent, as WGET is the most robust way to download the feeds! I'll pull together the next stage first thing tomorrow for you...

Cheers,
David.

Submitted by support on Fri, 2007-08-10 09:14

Hi Ken,

Here's a complete version of automate.php that contains a function for downloading feeds, and then a call to the import script. Please run this as a last test and copy the output you get. You should see something like this at the end:

backfilling reviews...[done]
Done.

If this doesn't work, no problem - there is an alternative method for importing the modified feeds. Here's the script:

automate.php

<?php
  header("Content-Type: text/plain");
  $feedsDir = "../feeds/";
  if (!is_writable($feedsDir)) { print "Cannot write to ../feeds/"; exit(); }
  function wget($filename,$url)
  {
    global $feedsDir;
    $cmd = "wget -O ".$feedsDir.$filename." \"".$url."\"";
    exec($cmd);
  }
  wget("bbc.xml","http://newsrss.bbc.co.uk/rss/newsonline_uk_edition/front_page/rss.xml");
  print "Importing modified feeds...\n";
  $cmd = "php ../scripts/import.php @MODIFIED";
  passthru($cmd);
  print "Done.";
?>

For now, this will download the BBC News RSS feed as bbc.xml. If this works, the only change needed to this file is to add multiple copies of the wget(filename,url) line, one for each feed. Once it's working, and we know that timeouts are not a problem, we'll look at how to run this script via CRON.

Cheers,
David.

Submitted by kajoku on Fri, 2007-08-10 09:28

Hi David, this is the result I get:

"Importing modified feeds...
Done."

Is this OK? If so, what shall I do next please?

K

Submitted by support on Fri, 2007-08-10 10:42

Hi Ken,

That's good - it means we're basically there, save for setting up the CRON job. Before doing that, it is best to check that you can now use this script to download your feeds successfully and import them without any timeout problems.

In your latest automate.php, you will see this line:

  wget("bbc.xml","http://newsrss.bbc.co.uk/rss/newsonline_uk_edition/front_page/rss.xml");

What you need to do now is remove that line, and replace it with multiple calls to your Affiliate Window feed URLs (make sure they are not the zipped versions!). For example:

wget("EmpreDirect.xml","http://www.example.com/feeds/get.asp?merchant=123&username=ABC&password=XYZ");
wget("Ebuyer.xml","http://www.example.com/feeds/get.asp?merchant=456&username=ABC&password=XYZ");

Replace the filenames and URLs with the ones you are using on your site, and then run automate.php. It should fetch the new feeds and then import them. I would recommend trying just one feed to start off with, then adding more once it's working. Once you're happy it's working, the last step is to make it a CRON job...
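For reference when that final step arrives, a control-panel cron form usually asks for a schedule and a command. A hypothetical crontab entry might look like the sketch below; the path is an assumption and must be adjusted to your own hosting layout.

```shell
# Run automate.php every night at 03:00 (minute hour day month weekday).
# The path below is hypothetical; use your own PHP and admin paths.
0 3 * * * php /home/yourname/public_html/admin/automate.php
```

Some hosts do not provide a PHP command line from cron, in which case fetching the script's URL on the same schedule (for example with wget) achieves the same result.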

Cheers,
David.

Submitted by kajoku on Fri, 2007-08-10 18:01

Hi David, thanks for that. Can you tell me please what I would need to add to the code, and where, to ensure that the unzipped .xml files will be selected as opposed to the zipped .xml or CSV files that AW has? Also, can you please give me the rest of the instructions for the cron job?

Many thanks
Ken

Submitted by kajoku on Fri, 2007-08-10 18:08

BTW I got a message
Importing modified feeds...
Done.

This was when it was run with one feed.

The original file was overwritten and now shows 0k.

Submitted by support on Fri, 2007-08-10 18:11

Hi Ken,

Can you post the wget(...) line that you added - remembering to remove your userID and password if they are in the URL...!

Cheers,
David.

Submitted by kajoku on Fri, 2007-08-10 18:19

Sure here it is

wget("aw_Empire_Direct.xml","http://www.deal-snatcher.com/applemac/feeds/get.asp?merchant=272&username=xxxx&password=xxxxxx");

Submitted by support on Fri, 2007-08-10 18:27

Hi Ken,

Can you double check the URL - I just tried it directly and got a 404 (Not Found), which is why it downloaded zero bytes...

Cheers,
David.

Submitted by kajoku on Fri, 2007-08-10 19:10

Oops - you're right. Can I ask, at the point merchant=, should that be the actual merchant's number or something from Affiliate Window? BTW, how will the code know that the feed is coming from Affiliate Window?

K

Submitted by support on Fri, 2007-08-10 19:14

Hi Ken,

I'm not totally sure what you mean there, as I've always obtained Affiliate Window's feeds directly from Affiliate Window.

Basically, you want to use whatever URL you have been using before to download the feeds, all we're doing here is automating what you've been doing manually, so simply use the same feeds and it should work fine...

Cheers,
David.

Submitted by kajoku on Tue, 2007-08-14 23:02

Hi David, sorry been bogged down with work over the last few days.

Can I ask you please whether this would be the kind of line of code that I would need to enter for AW? Sorry about this, I just want to get it right.

"aw_Empire_Direct.xml","http://www.affiliatewindow.com/affiliates/shopwindow/datafeeds.php/get.asp?merchant=272&username=xxxxx&password=*****

FORMAT
"feedname","aw feed url//get.asp?merchant=merchantnumber&username=xxxxx&password=*****

If not what do you suggest please.

Regards
Ken

Submitted by support on Wed, 2007-08-15 08:33

Hi Ken,

That looks exactly right - your script should contain a line like this:

wget("aw_Empire_Direct.xml","http://www.affiliatewindow.com/affiliates/shopwindow/datafeeds.php/get.asp?merchant=272&username=xxxxx&password=*****");

You can always test it by entering the URL into your browser manually and making sure you get the feed (uncompressed), but I think you'll find it works fine.

Cheers,
David.

Submitted by kajoku on Wed, 2007-08-15 14:18

Hi David, I've done what you suggested.

When I run the script I get a message

Importing modified feeds...
Done.

which seems to be fine; however, when I look at the file, its size is only 15.4kb. I also downloaded the file manually by logging into my account and going through the manual procedure, and the manually downloaded file was 4.25mb.
So I don't think the automated way is working the way we expect it to. Any suggestions please?

Regards
K

Submitted by support on Wed, 2007-08-15 15:05

Hi Ken,

If the file is only 15.4kb, that implies it is an HTML document, probably containing an error message generated by Affiliate Window, which should help diagnose the fault.
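A quick way to confirm that diagnosis is to look at the first few bytes of the downloaded file: a genuine feed begins with an XML declaration, while an error page begins with an HTML tag. The sketch below uses a stand-in file with a made-up error message, since the real download is account-specific; on a live site you would inspect the actual downloaded feed instead.

```shell
# Create a stand-in "feed" that is really an HTML error page, then
# inspect its first bytes. A genuine feed would begin with "<?xml".
printf '<html><body>Invalid username or password</body></html>' > suspect_feed.xml
head -c 6 suspect_feed.xml
echo
```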

Can you try looking at the file? You should be able to download it to your local computer and then open it in a text editor. If you're not sure, feel free to email the file to me and I'll take a look for you (reply to your reg code or forum registration email)...

Cheers,
David.

Submitted by kajoku on Wed, 2007-08-15 17:29

Hi David, I have sent the file via email as you suggested. Please can you have a look and let me know your comments?

Thanks
K