Questions
-
URL rewriting causing problems
You can write canonical tags dynamically, but each one needs to point to the product-specific page. Obviously, don't canonicalize every details.php page down to a single URL. It could look something like this:

```php
<?php
// Hypothetical values - in practice these would come from your database.
$prodDesc = "games-playstation-vita";
$prodCode = "B0054QAS";
$prodURL  = "http://www.mydomain.com/" . $prodDesc . "/" . $prodCode;
echo '<link rel="canonical" href="' . $prodURL . '" />';
?>
```

I assume that the product description and code are generated from a database, so they should be available somehow to the header.
Technical SEO Issues | | Dr-Pete0 -
Can a disapproved adwords ad affect SEO ranking?
No. The only way AdWords can have some kind of impact on organic is that people become aware of your site through the ads and later search for it by name. Beyond that, there is no link whatsoever between AdWords and organic rankings. At most, if your landing page or site has a problem, the AdWords review process might report it to a team at Google, but in that case Googlebot would spot it on its own eventually anyway, so which one found it first doesn't matter much.
Affiliate Marketing | | Catalyste0 -
Adwords kept getting disapproved
I am not an AdWords expert. I suggest possibly going to the AdWords help forums and asking there, where you'll find a few more experts than you might here, and possibly someone from Google themselves will step in. Also, as Gary suggested, give them a call.
Affiliate Marketing | | KeriMorgret0 -
Restricted by robots.txt does this cause problems?
Hello Ocelot, I am assuming you have a site with affiliate links and you want to keep Google from crawling those links. If I am wrong, please let me know. Going forward with that assumption, then...

That is one way to do it. So perhaps you first send all of those links through a redirect via a folder called /out/ or /links/ or whatever, and you have blocked that folder in the robots.txt file. Correct? If so, this is how many affiliate sites handle the situation. I would not rely on rel="nofollow" alone, though I would use it in addition to the robots.txt block.

There are other ways to handle this. For instance, you could make all affiliate links JavaScript links instead of href links, put the JavaScript into a folder called /js/ or something like that, and block that folder in the robots.txt file. This works less and less now that Google's preview bot seems to be ignoring the disallow statement in those situations. Or you could route every affiliate click through the same URL with a unique identifier that tells your database where to redirect the click, for example: www.yoursite.com/outlink/mylink#123 or www.yoursite.com/mylink?link-id=123. You could then block /mylink in the robots.txt file and tell Google to ignore the link-id parameter via Webmaster Tools.

As you can see, there is more than one way to skin this cat. The problem is always going to be doing it without looking like you're trying to "fool" Google - because they WILL catch up with any tactic like that eventually. Good luck! Everett
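As a minimal sketch of the blocked-redirect-folder approach described above (the folder and path names /out/ and /mylink are just examples from this answer, not required values), the robots.txt rules might look like:

```
# Hypothetical robots.txt - block the affiliate redirect paths from crawling
User-agent: *
Disallow: /out/
Disallow: /mylink
```

Remember that robots.txt blocks crawling, not indexing; that is why combining it with rel="nofollow" on the links themselves, as suggested above, is the safer play.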
Technical SEO Issues | | Everett0 -
Adding rel=next / prev to pagination that uses Ajax?
It's tricky - these tags are covered in Google's official announcement: http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html To steal their example, page 2 of a paginated series is going to need two tags in its head, like this:

```html
<link rel="prev" href="http://www.example.com/article?page=1" />
<link rel="next" href="http://www.example.com/article?page=3" />
```

If there's no page refresh and you're using AJAX, it's possible your paginated search isn't being indexed at all. It depends on what you're showing to bots. Unfortunately, that's nearly impossible to tell without seeing the site.
Intermediate & Advanced SEO | | Dr-Pete0 -
Pages that were ranking now are not?
How can you check whether your site is in the Google index, and how many pages of the site have been indexed?
Intermediate & Advanced SEO | | ocelot0 -
Page and domain authority?
Hi Thomas, yeah, it is new, so I am working hard on its visibility. Any ideas why the other two websites are ranking higher than me for the same search term described in the first post? Are they better optimised on-site (which I doubt)? Off-site and on social, they are quite weak.
Technical SEO Issues | | ocelot0 -
Facebook Like Quick Question
I have it set up so that the like button in the footer of the website likes my Facebook page, but on individual product pages the button likes that specific page, so it gets shared through their Facebook. I'd rather they like my page; then I can push more content through to users in the future.
Social Media | | Lantec0 -
Duplicate Content from SEOMOZ report?
Hi Ian, sorry, I tried an online tool, but it is not valid. The best-practice method of checking 404 errors is through Google Webmaster Tools or by scanning your site with Xenu (though that might take a while). There might be several reasons why those pages are appearing as 404s:

- There may have been dropouts on your site's server lately, making it an accessibility issue.
- There might be broken links (as we've discussed) that are causing these 404 errors.
- People entering weird search queries, combined with the dynamic URLs, may be getting those URLs indexed in search results.

Have a look, and if you have further issues don't hesitate to contact me directly. Thanks, Vahe
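If Webmaster Tools or Xenu aren't an option, here is a quick sketch of spot-checking 404s from a server access log instead (this assumes the common Apache/Nginx combined log format; the file path and field layout are assumptions about your setup):

```python
import re
from collections import Counter

def count_404s(log_lines):
    """Count how often each URL returned a 404 in combined-format log lines."""
    # Matches the request and status fields, e.g.: "GET /path HTTP/1.1" 404
    pattern = re.compile(r'"[A-Z]+ (\S+) [^"]*" (\d{3})')
    counts = Counter()
    for line in log_lines:
        match = pattern.search(line)
        if match and match.group(2) == "404":
            counts[match.group(1)] += 1
    return counts

sample = [
    '1.2.3.4 - - [10/Oct/2011:13:55:36 -0700] "GET /missing.php HTTP/1.1" 404 209',
    '1.2.3.4 - - [10/Oct/2011:13:55:40 -0700] "GET /index.php HTTP/1.1" 200 4523',
    '5.6.7.8 - - [10/Oct/2011:13:56:01 -0700] "GET /missing.php HTTP/1.1" 404 209',
]
print(count_404s(sample))  # Counter({'/missing.php': 2})
```

Ranking the results by count shows which "weird" URLs are hit most often and are therefore worth redirecting or fixing first.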
Technical SEO Issues | | Vahe.Arabian0 -
Sitemap Creation
If you purchase the paid version ($20), you can create an XML sitemap for an unlimited number of pages: http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html You can also use other software, e.g. A1 Sitemap Generator, or crawler-based tools such as Xenu or Screaming Frog, to create your XML sitemap. However, my recommendation would be to create individual XML sitemaps per category. While it is not strictly necessary, it will improve the likelihood of search crawlers understanding your site's hierarchy, grouped by product platform, and will help search engines index the maximum number of pages possible. Don't forget to also create an image sitemap. Hope this helps, Vahe
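A per-category setup like the one described above is usually tied together with a sitemap index file that you submit in Webmaster Tools; a minimal sketch (the domain and category file names here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-playstation.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-xbox.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-images.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child sitemap then lists only the URLs for its own category (or, for the image sitemap, the image entries).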
Technical SEO Issues | | Vahe.Arabian0 -
New Site Search Critique
Re: item 1 - the answer can be found in the first response to this Q&A: http://www.seomoz.org/q/i-need-help-with-htaccess-redirect If you would like to learn more about htaccess, an excellent resource is here: http://net.tutsplus.com/tutorials/other/the-ultimate-guide-to-htaccess-files/

Re: item 3 - no damage.

Re: item 7 - I did share a suggestion. I prefer to use the home page to target the site's brand, and inner pages one click from the home page to target the site's main keywords. Your site can target hundreds of keywords, so unless one particular keyword is highly dominant over the others, this approach is my preference.

Re: item 10 - no. The W3C is the governing body for HTML/CSS standards; I was referring to using their validator tool to ensure the code used on your site is valid.

Re: sitemaps - there are numerous software apps available to create a sitemap. Find one you like and set up a cron job to run as needed to generate a new sitemap.
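As a sketch of the cron approach mentioned above (the generator path, flags, and schedule here are hypothetical placeholders - substitute whatever sitemap tool you chose):

```
# crontab -e entry: regenerate the sitemap every night at 2:30 AM
30 2 * * * /usr/local/bin/sitemap-generator --out /var/www/html/sitemap.xml
```

Once the generated file is in your web root, search engines will pick up the fresh copy on their next fetch, or you can resubmit it via Webmaster Tools.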
Technical SEO Issues | | RyanKent0 -
Canonicalisation - Best Approach?
If what you are trying to achieve is a canonical www version, you are better served doing two things.

First, go to Google Webmaster Tools and select a preferred domain. To specify your preferred domain (from GWMT):

1. On the Webmaster Tools home page, click the site you want.
2. Under Site configuration, click Settings.
3. In the Preferred domain section, select the option you want.

You may need to reverify ownership of your sites. Because setting a preferred domain impacts both crawling and indexing, Google needs to ensure that you own both versions. Typically, both versions point to the same physical location, but this is not always the case. Generally, once you have verified one version of the domain, they can easily verify the other using the original verification method. However, if you've removed the file, meta tag, or DNS record, you'll need to repeat the verification steps. Note: once you've set your preferred domain, you may want to use a 301 redirect from your non-preferred domain, so that other search engines and visitors know which version you prefer.

Second, to 301 redirect the non-www version to the www version, here is the .htaccess script from Scriptalicious (note the escaped dot in the hostname match):

```apache
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^yourdomain\.com [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]
```

Hope this helps you.
Technical SEO Issues | | RobertFisher0 -
SEO and Pagination on search results
Doug is correct; the best practice is not to use AJAX for the initial page load, only for further requests.
On-Page / Site Optimization | | AlanMosley0