Hello,
I bought the software today, installed it successfully, and have uploaded my first two feeds. I have a couple of questions about the feeds:
1. Does each merchant feed have to be uploaded individually? Webgains, for example, can give you combined merchant feeds, but the merchant name field in the Register process suggests that each feed should be from one merchant only.
2. Can I use live feed URLs instead of CSV files?
3. If I can only use CSV files, what's the minimum amount of time between updates that webmasters generally agree on?
This is the latest of several scripts I've tried. I like what I see so far!
Regards,
Ben
Hi David,
Thanks for the answers. I can see from your replies that it's best to start with a niche and pull in the datafeeds that relate to that niche from each affiliate network I belong to, otherwise it's going to be too big to manage. Even with some of the more common niches there may be 100+ feeds to update each week.
Best get started then ;-)
Regards,
Ben
I just thought I would add a comment to this post from my personal experience. Most feeds are updated weekly or twice weekly by the merchants, so I update my main feeds twice a week. Even though I use a dedicated server, I still split the fetching into two main files because of the number of feeds I download, to reduce the load on the server. My database currently has around 1,500,000 products in it, so I need to split them in this way. I also have several RSS product feeds, which I update daily using a separate fetch file, as content changes more often on them. All three files use the same basic system, and XML is preferable to CSV as a general rule.
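As a rough illustration only, a split schedule like mine might look something like this in a crontab (the script names, paths and times here are placeholders, not my actual setup):

```
# Two main fetch files, each run twice a week (Mon and Thu), staggered
# by a couple of hours to spread the server load:
0 2 * * 1,4 /usr/bin/php /home/example/scripts/fetch_main_1.php
0 4 * * 1,4 /usr/bin/php /home/example/scripts/fetch_main_2.php
# Separate RSS fetch file, run daily as content changes more often:
0 3 * * * /usr/bin/php /home/example/scripts/fetch_rss.php
```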
I hope this helps you.
Kind regards
Stephen
The Big Business Directory
www.the-big-business-directory.co.uk
www.the-big-business-directory.com
Hi Stephen,
Thanks for the tips. You're right about XML; I've just used XML files instead of the CSV files I had used on my first attempt, and now there are far fewer errors and no need to set up filters for missing prices, for example, so I'll stick to XML from now on.
So the first site build is the most time-consuming part, but after each feed is registered all I have to do is download an update each week, upload it to the site and import it all, is that right?
Do you have any shortcuts, e.g. downloading/uploading as a zip and then unzipping on the server? This might save only a few seconds per file, but with so many files that could add up to an hour or two a week.
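Something along these lines is what I had in mind, fetching the compressed feed straight onto the server and decompressing it there. The URL and paths are hypothetical, and the fetch step is shown commented out; the gunzip step is illustrated with a locally created file:

```shell
#!/bin/sh
# Sketch of "fetch compressed, unzip on the server".
FEED_DIR="${TMPDIR:-/tmp}/feeds"
mkdir -p "$FEED_DIR"

# Step 1: fetch the gzipped feed onto the server (uncomment for real use,
# substituting your network's actual feed URL):
# wget -q -O "$FEED_DIR/merchant.xml.gz" "http://datafeeds.example.com/merchant.xml.gz"

# For illustration, create a small gzipped feed locally instead:
printf '<products><product><name>Widget</name></product></products>' \
  | gzip > "$FEED_DIR/merchant.xml.gz"

# Step 2: decompress in place; gunzip replaces merchant.xml.gz with merchant.xml.
gunzip -f "$FEED_DIR/merchant.xml.gz"

ls "$FEED_DIR"
```

Whether the saving is worth it would depend on how large the uncompressed feeds are.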
1,500,000 products! Blimey, you must do this all week!
Regards,
Ben
Hi Ben,
Once a feed is registered, each time you update the feed you only need to upload the new feed and then import again. Registration only needs to be done once.
Do have a look at the automation guide which has lots of tips on setting everything up to download and import automatically...
http://www.pricetapestry.com/node/198
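As a very rough sketch of the idea, an automated run fetches each registered feed in turn and then triggers the import. All script names, paths and URLs below are placeholders, not the script's actual commands; the automation guide describes the real mechanism:

```shell
#!/bin/sh
# Hypothetical automation sketch: loop over the registered feeds,
# fetch each one, then run the import.
FEED_DIR="${TMPDIR:-/tmp}/feeds"
mkdir -p "$FEED_DIR"

# One feed per line: local filename, then source URL (both hypothetical).
cat > "$FEED_DIR/feeds.txt" <<'EOF'
merchantA.xml http://network.example.com/feeds/merchantA.xml
merchantB.xml http://network.example.com/feeds/merchantB.xml
EOF

while read -r FILENAME URL; do
  echo "fetching $URL -> $FEED_DIR/$FILENAME"
  # wget -q -O "$FEED_DIR/$FILENAME" "$URL"   # uncomment for real use
done < "$FEED_DIR/feeds.txt"

# After all feeds are fetched, trigger the import step (placeholder command):
# php import.php @ALL
```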
Cheers,
David.
As David said above, you register each feed once and the fetching is done automatically via cron jobs.
My site is only very new and is, to my mind, very slowly getting onto the search engines. I know this always takes time, although Google has now added over 3,000 pages this week, which is nice! My site is in its infancy and as yet unproven to the search engines in terms of time and reliability. After all, when they are indexing billions of pages it is not as important to the search engines as it is to me, but this is always going to be the case!
My feeds all download unzipped, even the feed with over 260,000 products in it. If you are doing that you will really need a dedicated server, as shared hosting will most likely time out at some point. My dedicated server includes MySQL databases of unlimited size, all for £50 + VAT per month. Obviously, if you operate, as I do, from organic search engine listings only with no PPC, you will initially lose money on this until the site gets more traffic. I am aiming to have 6,000,000 products on my site by summer next year and, hopefully, 10,000,000 by Christmas next year!
Kind regards
Stephen
The Big Business Directory
www.the-big-business-directory.com
www.the-big-business-directory.co.uk
Hi David,
Thanks for the tips. I do rent a server already, so that part is taken care of, and I've got the cron jobs working with Paid on Results and will try all the other networks I'm currently registered with. I've gone for a niche area first (clothing), and once I've learnt the ropes with that I'll try something a bit more ambitious.
I hope you succeed in your plans. Out of interest, how many hours do you spend on this each week? This isn't my main occupation so time is limited.
Regards,
Ben
Hi Ben,
Thank you for your comments.
In terms of time spent on my own Price Tapestry sites (excl. the demo site) it probably works out at about a day a week - most of that managing merchants as they come and go from the networks...
Cheers,
David.
Hi David,
Thanks for that. Actually, my last post was directed at Stephen but I posted your name by mistake!
Regards,
Ben
Ben,
Join the club!
I work up to 60 hours a week in my full-time job and around 40-50 hours a week on my websites. Yes, that is a lot of hours!! If it all works out it will be worth it!
Kind regards
Hi Ben,
To answer your questions:
1/ You need an individual feed for each merchant. Not only does this make managing your site easier, it also makes it less likely that you will experience timeout problems when importing.
2/ If by live feed URLs you mean API (application programming interface) access to products, this is not something that is supported by Price Tapestry. The script is designed to use product feeds only.
3/ Remember that you can use XML as well as CSV, and XML is preferable for a number of reasons. Personally, I update merchants once per week; however, you may come across merchants who indicate in their programme information the frequency at which they would like you to update, but this is rare. Some users update every day, or, to manage resources on their server, update a portion of the feeds each day so that over the course of a week all merchants are updated.
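The staggered approach could be as simple as grouping the feeds into seven lists and having a daily cron run fetch only the current day's group. A minimal sketch, assuming a hypothetical layout of list files day0.txt through day6.txt:

```shell
#!/bin/sh
# Pick today's feed group by day of week, so each merchant
# is refreshed once over the course of the week.
DOW=$(date +%w)   # day of week: 0 (Sunday) through 6 (Saturday)
echo "day $DOW: would fetch the feeds listed in day$DOW.txt"
```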
Thanks for your comments - good luck with your site!
Cheers,
David.