Questions
Gallery maintenance and the effect on SEO
Hi there. It's a bit difficult to understand your actual question, but as far as I can see, you're asking whether you should remove unrelated (and therefore unapproved) images from the backend of the gallery, and whether that would have any SEO effect. Am I correct? If so, I see a couple of potential benefits of removing those photos from the backend. First, just for the mental health of the administrator - I'd go nuts if I had to scroll through a bunch of old junk every time I had to moderate new photos. Second, depending on how your frontend and backend are connected and process things, it might in fact speed up rendering of the gallery on the frontend. Faster website = happier users = more conversions. Cheers.
Intermediate & Advanced SEO | | DmitriiK0 -
Sitemaps: Best Practice
Pages that I like to call 'core' site URLs should go in your sitemap - basically, unique (canonical) pages which are not highly duplicated and which Google would wish to rank.

- I would include core addresses.
- I wouldn't include uploaded documents, installers, archives, resources (images, JS modules, CSS sheets, SWF objects), pagination URLs, or parameter-based children of canonical pages (e.g. example.com/some-page is fine to rank, but not example.com/some-page?tab=tab3). Parameters are the additional funky stuff added to URLs after "?" or "&". There are exceptions to these rules - some sites use parameters to render their on-page content, even for canonical addresses - but those old architecture types are fast dying out.
- If you're on WordPress, I would index categories but not tags, which are non-hierarchical and messy (they really clutter up your SERPs).
- Try crawling your site with Screaming Frog. Export all the URLs (or a large sample of them) into an Excel file, filter it, and see which types of addresses exist on your site and which technologies are being used. Feed Google the unique, high-value pages that you know it should be ranking.
- I said not to feed pagination URLs to Google, but that doesn't mean they should be completely de-indexed - I just think XML sitemaps should be lean and streamlined. You can allow things which aren't in your XML sitemap a chance of indexation, but if you have used something like a meta noindex tag or a robots.txt edit to block access to a page, **do not** then feed it to Google in your XML. Try to keep **all** of your indexation modules in line with each other!
- No page which points to another, separate address via a canonical tag (thus calling itself 'non-canonical') should be in your XML sitemap.
- No page that is blocked via meta noindex or robots.txt should be in your sitemap.xml either.
- If you end up with too many pages, think about creating a sitemap XML index instead, which links through to other, separate sitemap files.

Hope that helps!
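For reference, a sitemap index is just a small XML file pointing at the child sitemaps - a minimal sketch with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap may hold up to 50,000 URLs -->
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
    <lastmod>2018-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-categories.xml</loc>
    <lastmod>2018-01-15</lastmod>
  </sitemap>
</sitemapindex>
```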
Intermediate & Advanced SEO | | effectdigital0 -
Regex in Disavow Files?
Hi Fubra, You can disavow at a domain level, so no regex is required (and I don't think it will work). Just add "domain:" before the domain, eg. domain:spammysite.com Marie Haynes wrote a good guide to using the disavow tool here if you need any further information: https://moz.com/blog/guide-to-googles-disavow-tool Cheers, David
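To illustrate, a disavow file is just a plain-text list, one entry per line, with `#` lines treated as comments - a minimal sketch with hypothetical domains:

```text
# Links from these domains were acquired by a previous agency
domain:spammysite.com
domain:anotherspammysite.net
# Individual URLs can also be disavowed
http://example.org/spammy-page.html
```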
Intermediate & Advanced SEO | | davebuts0 -
PPC: How to get rid of an ad appearing on a keyword we don't want?
Hi Fubra, The first thing I can recommend is to go into your AdWords account, then into the campaign, then into the tab titled 'Keywords'. Once there, click into the tab titled 'Search Terms'. In the 'Search Terms' tab, you should see all the keywords that triggered your ad to show. Look for the specific keyword that is unwanted, select it by clicking the selection box on the left, and you should see a bar pop up that lets you choose what to do with this keyword. Select the option 'Add as Negative Keyword', and that should stop the ad from triggering each time someone searches for the unwanted keywords. Pro-tip: As part of your PPC optimization process, you should also be looking in the 'Search Terms' section of your campaigns/ad groups on a regular basis to see exactly what search terms are triggering your ads to show. There are bound to be search terms that are non-converters, as well as irrelevant search terms. By continually adding those to your list of negative keywords, you make your PPC campaigns more cost-effective. Good luck!
Paid Search Marketing | | NgEF1 -
PPC: How do we get our reviews into AdWords?
I can't think of another way. They were great & gave a solid increase in CTR. However, they were a pain to get approved.
Reviews and Ratings | | KevinBudzynski0 -
Fixing 404s
I'm getting into the habit of fixing them where I can - if the page is linked from anywhere - by fixing or redirecting. Then I mark them as fixed in Search Console. For any pages I'm particularly worried about, I recrawl via 'Fetch as Google' in the hope that Google will recognise them as gone. Reading Google's documentation, it says that 404s won't necessarily harm your site and 'in most cases' should be left to 404. So basically, I'm trying to use the tools available to us to see what errors are there and fix the ones that are worth fixing. I know what you mean about having loads of frustrating 404s - as an SEO trying to find things to fix, it's an absolute nightmare. But try marking them as fixed and recrawling them if necessary. Perhaps also - since your site is ecommerce - run a canonical campaign whereby you canonicalise to the most up-to-date product, and maybe create a custom 404 page, like: "This is a 404, but here are some similar products you can browse: <a>red shirt</a>, <a>blue shirt</a>" etc. Hope that helps?
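As a rough sketch of that last idea, here's a small (entirely hypothetical) helper that builds the body of a "helpful" 404 page linking to similar products - the function name, product names, and URLs are made up for illustration, and in practice the suggestions would come from a keyword match against your catalogue:

```python
import html


def build_helpful_404(missing_path, suggestions):
    """Return HTML for a 404 page that links to similar products.

    `suggestions` is a list of (name, url) tuples - in a real shop
    these would be looked up from the product catalogue.
    """
    links = "\n".join(
        f'    <li><a href="{html.escape(url)}">{html.escape(name)}</a></li>'
        for name, url in suggestions
    )
    return (
        "<h1>Sorry, that page is gone (404)</h1>\n"
        f"<p>We couldn't find <code>{html.escape(missing_path)}</code>, "
        "but here are some similar products:</p>\n"
        f"<ul>\n{links}\n</ul>"
    )


page = build_helpful_404(
    "/products/red-shirt-2015",
    [("Red shirt", "/products/red-shirt"), ("Blue shirt", "/products/blue-shirt")],
)
```

The important part is that the server still returns a real 404 status code with this body, so search engines don't index the page while users still get somewhere useful to go.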
Search Engine Trends | | Fubra0 -
410 or 301 after URL update?
Yeah, of course I can explain more. An HTTP 410 status code tells Google that you've removed that page and that it will never be live again, so Google will drop that URL from its index and stop crawling it. That said, Googlebot can also read these signals as a sign that the site is working poorly. How can this happen? When you have a massive number of 404s, 301 redirects, 410s or 5xx errors, your site might be downgraded, possibly deindexed, see reduced bot crawling frequency, or suffer any other penalty you might imagine. Some info about the 410 status code: HTTP/1.1 status code definitions. Hope it helps. Best of luck. GR
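If you do decide a URL is gone for good, returning a 410 is a one-liner in most servers - for example in nginx (the path here is hypothetical):

```nginx
# Answer 410 Gone for a permanently retired URL
location = /old-page {
    return 410;
}
```

Apache users can achieve the same with the `Redirect gone /old-page` directive in mod_alias.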
Intermediate & Advanced SEO | | GastonRiera0 -
Changing sitemaps in console
Thanks Gaston. I've discovered the check-box and delete option in Search Console, so thankfully I've managed to delete the old ones, and I've amended the new ones as you recommended. Thanks for your advice - we'll keep an eye on the results to see what happens now!
Intermediate & Advanced SEO | | Fubra0 -
Putting nav code at the bottom of a page?
I have placed the nav code at the bottom of the HTML doc at times. I can't really say that it makes a significant difference for SEO. It doesn't take a lot of work if you are skilled with HTML/CSS, but I can't really say that this methodology will have a long-term benefit for SEO. HTML5 has new semantic tags that sites should adopt, such as `<nav>`, to indicate what each chunk of content is. These tags are supported by all major browsers at this point (I don't know the specific versions), and I would recommend them moving forward where possible. By using these tags, crawlers likely won't need to factor in a chunk's position in the document to understand its importance.
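As a sketch, a layout where the navigation markup sits last in the source but is still clearly identified by the HTML5 `<nav>` element might look like this (the links are placeholders):

```html
<!DOCTYPE html>
<html>
<body>
  <main>
    <h1>Page title</h1>
    <p>Primary content comes first in the source.</p>
  </main>
  <!-- Nav placed last in the document, but labelled semantically -->
  <nav>
    <a href="/">Home</a>
    <a href="/products">Products</a>
  </nav>
</body>
</html>
```

CSS can then position the nav visually at the top of the page regardless of where it appears in the source order.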
Technical SEO Issues | | bloggidy0 -
Will a MediaWiki guides section affect my site's SEO?
I do, I run a pretty large wiki (2400 articles, over 1000 users) and I can definitely vouch for it. Out of the box it uses canonical tags as well (for redirect pages), knows which pages it should index and which it shouldn't (edit & login pages for instance are noindexed) and Google can handle an out of the box install pretty well. If needed you can even use one of the meta plugins to optimize your descriptions, titles and noindex/nofollow certain pages or even namespaces. (I wrote the AdvancedMeta plugin, but there's plenty of others).
On-Page / Site Optimization | | StephanM0