I'm running a campaign for a coupon site; coupons are added and expire weekly. What are the most important factors you think I should focus on?
-
Coupon sites are unique in that the content is fluid: coupons expire weekly and categories have thin content, but the main goal is to get the merchant pages (Macy's, Target, Best Buy coupons) indexed. I'm looking for advice on technical SEO that may be unique to feed and affiliate sites, e.g. what to block in robots.txt, such as affiliate URLs like /go and /gostore, and categories. Any advice and passionate discussion is welcome...
-
Hi.
I'm not quite sure why you'd block anything in robots.txt. I hope you understand how robots.txt works and the differences between meta robots, robots.txt, 301 redirects, and canonicalization. If you're not sure, here is a Whiteboard Friday about that: https://moz.com/blog/controlling-search-engine-crawlers-for-better-indexation-and-rankings-whiteboard-friday
I don't even know what you mean by "blocking affiliate URLs" in robots.txt, and I'm quite sure that's not possible. As for blocking categories: do you mean you wouldn't want them ranking? Even so, I'd use meta robots NOINDEX, FOLLOW instead.
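For reference, the meta robots approach mentioned above would be a tag like this in each category page's head (a generic example, not tied to any particular CMS). Unlike a robots.txt Disallow, this lets the crawler fetch the page and follow its links while keeping the page itself out of the index:

```html
<meta name="robots" content="noindex, follow">
```

Note that if the same URL is also blocked in robots.txt, Google can never crawl the page to see this tag, so pick one mechanism per URL.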
About meta descriptions: they are pretty much just for improving CTR, not for rankings, so if you're happy with snippets that are just "store name, text, date of month", then just do that.
However, "unique meta descriptions and titles and content on each merchant page to compete with the big boys" does make sense and is common practice. You definitely want unique content to have any chance.
Hope this helps.
-
As someone who works in this space I'm not going to give away all of my secrets (sorry), but I can advise you to ask the following questions and consider the answers carefully when deciding what to do:
What are you offering on any given page that is unique or better than what the next coupon site offers? Are you saying something insightful or helpful about Macy's on the Macy's page?
What does the link/text ratio look like on that page?
Is the coupon content 100% unique to you, or are you pulling that content in from a feed that also services other coupon sites? (Basically, do you have a large scale duplicate content issue across websites you don't control?)
Are coupons listed on a merchant's page while each coupon also exists as its own individual page, potentially creating a thin and duplicate content situation on the site?
If you want individual coupons to be indexed, should they remain indexed once they've expired?
Are you using a redirecting URL to mask/beautify/track the affiliate link? Do you want that link to be indexed even though it's not actually a page with content? (Sounds a bit crazy but we've run into issues with that in the past, just like how bit.ly links can get indexed sometimes, defying all logic.)
(e.g. yoursite.com/couponclick/1234 --> super ugly affiliate link --> merchant page)

I'm going to be bluntly honest here, and not because I don't want the competition, but as a 5-year veteran in this niche and because you should know what you're getting into:
The coupon space is extremely difficult to break into; I literally cannot emphasize this enough. The big players are very well established. Most of them aren't just offering coupons, but what I think of as "coupons+" (coupons + deals, coupons + cash back, coupons + charitable donation, etc.), and users respond to that and share it. RetailMeNot is deluged with high-authority inbound links every time their stock moves a penny.

Thin and duplicate content is endemic thanks to the ease of installing feeds, and making that content not just unique but also authoritative is easy to recommend but crazy difficult to implement at scale. Link building is next to impossible, since no one really links to a Macy's coupon page, let alone a page for 2nd- and 3rd-tier stores, without some kind of financial incentive, and Google gives no link credit for affiliate links (which should be nofollowed anyway).

Users mostly aren't loyal to one site or another; they just want the coupon so they can get $5 off of some UGG boots. And when newcomers aren't able to cope or compete given all these difficulties, they often resort to blackhat tactics, making it even harder for those of us who keep things white hat to stay competitive (though Google has been much, much better about booting that crap since the Payday Loan update).

Really, titles and metas, while it's important to get them right, are going to be the least of your challenges. Be thoughtful about the content, be user-friendly, find a way to stand out, and you might have a shot.
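To make the redirecting-affiliate-URL question above concrete, here is a minimal sketch of how such an endpoint might work. This is a hypothetical WSGI-style handler; the /couponclick/ path, the coupon ID, and the affiliate URL are made up for illustration and aren't anyone's actual setup:

```python
# Hypothetical sketch: 302-redirect yoursite.com/couponclick/<id> to the
# merchant's "super ugly" affiliate URL, while telling crawlers not to
# index the redirecting URL itself.

AFFILIATE_LINKS = {
    # coupon id -> affiliate URL (both invented for this example)
    "1234": "http://www.merchant.com/?affiliate-id=123&landing-page=abc",
}

def couponclick_app(environ, start_response):
    """Resolve /couponclick/<id> and bounce the visitor to the affiliate URL."""
    coupon_id = environ.get("PATH_INFO", "").rsplit("/", 1)[-1]
    target = AFFILIATE_LINKS.get(coupon_id)
    if target is None:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"Unknown coupon"]
    start_response("302 Found", [
        ("Location", target),
        # An X-Robots-Tag response header keeps the redirect URL itself
        # out of the index even if it picks up links.
        ("X-Robots-Tag", "noindex"),
    ])
    return [b""]
```

One design note: X-Robots-Tag and a robots.txt Disallow are alternatives here, not a pair; if the path is Disallowed, crawlers never fetch it and so never see the header.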
-
Dmitrli,
Regarding the blocking of affiliate URLs in the robots.txt file: you can and probably should do this, by first running those links through an internal redirect. For example:
If the Merchant gives you the following URL:
http://www.merchant.com/?affiliate-id=123&landing-page=abc
You wouldn't show that URL to Google or users. Instead, you would show them:
http://www.youraffiliatesite.com/affiliate-link?id=123
And then in your robots.txt would be:
Disallow: /affiliate-link?id=*
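Pulling that together with the /go and /gostore paths from the original question, a complete robots.txt for this kind of setup might look like the fragment below (the paths are examples from this thread, not a recommendation for any specific site). Note that robots.txt rules match by prefix, so no trailing wildcard is needed:

```
User-agent: *
Disallow: /go/
Disallow: /gostore/
Disallow: /affiliate-link?id=
```

One caveat: a Disallowed URL can still appear in results as a bare URL if other sites link to it, since Google can't crawl it to see any noindex directive.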
-
Rebecca... thank you for your passionate response. This is what I love most about this business: we all love what we do. I agree with what you have said, and of course I'm not looking for proprietary info; we all have our techniques. But in the end it's all about great technical SEO, as you discussed, and increasing organic traffic. My client has a pretty established site, and with PPC efforts and technical SEO all the metrics have improved greatly; more importantly, they are seeing ROI. The part about cloaking links is some great info! Thanks for your response, appreciated.
-
Everett
Perfect, I see you got the gist of my conversation. What I'm going for is keeping the focus on the static and dynamic content while preventing the fluid content from bloating search results. In the end, I believe this benefits search engines, so they can separate the meat from the bone, and I don't expend crawl budget on thin content.