Questions
Job Board SEO
Hello SO_UK. That is a tough question to answer without more details. Is there more than one URL for each job posting or category page? If not, you can probably do without the canonical tags. If you want me to have a quick look, feel free to send me the link in a private message. Otherwise, I can only give the general advice that any time you have multiple URLs for the same page, you should use a rel="canonical" tag.
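If it helps, this is roughly what such a tag looks like (the URL is just a placeholder):

```html
<!-- Placed in the <head> of every duplicate URL, pointing at the preferred version -->
<link rel="canonical" href="https://www.example.com/jobs/software-engineer" />
```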
Intermediate & Advanced SEO | | Everett1 -
Company name ranking
Have you reviewed this? https://moz.com/blog/wrong-page-ranks-for-keywords-whiteboard-friday
Technical SEO Issues | | DonnaDuncan0 -
SEO & IFrame problem
Hi, That is a confusing problem, but it seems like most of the page content is coming from the iframe and not the page itself. If you can, I would add a robots "noindex, nofollow" meta tag to the page that loads inside the iframe and see if that helps (there is no noindex attribute for the iframe element itself). If possible, I would build out the content on the correct page as much as possible without affecting its usability too much. Also, you can try a rel="canonical" tag pointing to the correct page, and hopefully that will help.
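For reference, a rough sketch of the tags mentioned above (all URLs are placeholders):

```html
<!-- In the <head> of the page that loads inside the iframe -->
<meta name="robots" content="noindex, nofollow" />

<!-- And a canonical pointing search engines at the correct page -->
<link rel="canonical" href="https://www.example.com/correct-page" />
```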
Technical SEO Issues | | Dalessi0 -
Redirection Question - Can Anyone Help?
Hi there I agree with Kevin here. Relevance is key when it comes to your users (and search engines). You could take it a step further and redirect job URLs on board B to the related job URLs on board A, that way users go directly where they need to go should they have clicked a link from a third party or another site. It might take a minute to do, but the payoff is a better user experience and organic equity being transferred correctly. Here's a tutorial on how to do a redirect map. Hope this helps! Good luck! P
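As a rough illustration of the redirect-map idea (the URL patterns and job IDs below are made up, not taken from either board):

```python
# Sketch: build a one-to-one 301 redirect map from board B's job URLs
# to the related job URLs on board A. Adjust the patterns to the real boards.

def build_redirect_map(job_ids):
    """Map each hypothetical board-B job URL to its board-A equivalent."""
    return {
        f"/board-b/jobs/{job_id}": f"/board-a/jobs/{job_id}"
        for job_id in job_ids
    }

redirects = build_redirect_map(["1001", "1002"])
for old_url, new_url in redirects.items():
    # Emit Apache-style rules; adapt the output to your server of choice.
    print(f"Redirect 301 {old_url} {new_url}")
```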
International Issues | | PatrickDelehanty0 -
301 Re-directing 'empty' domains
Hello there! Are you sure those were really empty domains? Have you checked whether they had any history with web.archive.org? What I'm understanding from your question is that the rankings were hurt as soon as the redirects took effect and were indexed. Solution? Do not redirect those domains to that keyword page. Just make a 301 to the home page, root domain to root domain: empty-domain.com -> main-domain.com. Also, have you checked for any other reason for that drop in rankings? Bad backlinks, an increase in your competitors' authority, poor optimization on your own site or better optimization by your competitors, or anything else? Best of luck. GR.
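In Apache terms, that root-to-root redirect could look something like this (assuming .htaccess with mod_rewrite is available on the empty domain; the domain names match the example above):

```apacheconf
# In the empty domain's .htaccess: send every request to the
# main domain's homepage rather than to a keyword landing page.
RewriteEngine On
RewriteRule ^ https://main-domain.com/ [R=301,L]
```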
Technical SEO Issues | | GastonRiera0 -
Simple duplicate content query
If the only change is to the URL (and job reference) and the content is otherwise the same, you'd definitely want to create 301s for these - this is exactly what 301s are for, after all. The 301s should now pass on 100% of the PageRank. There are caveats (PageRank is not the only ranking signal); see here: https://moz.com/blog/301-redirection-rules-for-seo Also, think of the user experience: if I've bookmarked a job, I'd expect to click the link and find it. A thought: if the new system could append the job number with a fixed value, creating a pattern-match redirect rule could make easy work of this for them. Then, with your 301s in place (and confirmed to be in good order - you can run a crawl test: https://moz.com/researchtools/crawl-test), you can remove your old listings. Sorry for the multiple revisions of this answer, Adam; I was in the bath when I started writing it!
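If it does come down to a pattern-match rule, one possible shape (the URL patterns here are hypothetical) is a single regex redirect, e.g. with Apache's mod_alias:

```apacheconf
# Old job URLs like /jobs/1234 redirect to /vacancies/job-1234 on the new system.
RedirectMatch 301 ^/jobs/([0-9]+)$ /vacancies/job-$1
```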
Technical SEO Issues | | Hurf0 -
International SEO Query
As Maurice wrote, I urge you to remember that American English is different from British English, so it would be useful to revisit the content in order to make it fit how English is spoken and written in the USA. That will have an impact on how Americans perceive the site, on localization and, obviously, on SEO geo-targeting.

Then, yes, use the hreflang annotations in order to clearly suggest to Google which version must be shown to whom, and geo-target the USA site in Google Search Console (and do not forget to do the same in Bing).

Regarding the choice between a new domain name, a subfolder or a subdomain for the USA website, that decision must consider more than just SEO factors. For instance, if your UK website is already receiving an interesting volume of traffic, then you can reasonably consider going for an independent domain name (it should be a generic TLD, because in the USA the ccTLD .us is not even considered "American"), so that you can also create a very geo-targeted experience for the USA audience. If you have traffic, but not in great proportions, and if your website is not an ecommerce site or one with a very complex and big URL structure and database, then you can consider opting for the subfolder option. On the contrary, if your traffic from the USA is not big already and the site architecture is complex and the database very big, then it would be better to go for the subdomain option. In the last two cases, though, the ideal would be to eventually migrate the USA site to an independent domain name once its visibility in the USA, both in terms of search and brand recognition, has reached sustainable levels.

Finally, remember that competition on .com is much harder than on .co.uk, so be prepared to struggle more than you have until now to earn in the USA the same results you earned in the UK.
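For reference, hreflang annotations along these lines would go in the head of both versions (the URLs are placeholders; each page must also reference itself):

```html
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```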
International Issues | | gfiorelli10 -
Google displaying different meta descriptions for the same URL but different keyword
Awesome resources here from Laura on the subject - definitely check them out!
Technical SEO Issues | | PatrickDelehanty0 -
Trackback URLs & temporary re-directs
Hi Lewis, Can you share an example? Am I correct in thinking you mean external links from your blog posts (trackbacks) are going through 302 redirects? Craig
Technical SEO Issues | | CraigBradford0 -
Subpage ranking for homepage keyword
Hi Phillip, Try using the On Page Optimization tool under "Search" in your Moz Analytics account to grade each page for the term. It could be that the internal page is better targeted for the term, even though it's only loosely related. It's also possible that the subpage has received a high number of links from third parties, although this is fairly rare. Is there any chance that the home page is under some sort of filter or penalty from backlink activity (e.g. too many links pointing at the home page with the target term used as anchor text)? Although also not terribly common, Google has been known to penalise individual pages but not entire websites, based upon the page's backlinks. You can also try linking to the home page from the subpage and including the target term in the link, but not linking to the subpage from the home page.
Intermediate & Advanced SEO | | JaneCopland0 -
Existing content & 301 redirects
Yes, a toxic domain will infect a new domain if a 301 is implemented. I would lean towards cleaning up the existing domain. Even if you end up disavowing every linking domain, the existing domain is likely to have built more trust than starting from scratch. If it is a manual penalty, ensure you document all the steps taken to clean up, so you can detail them in the reconsideration request.
Technical SEO Issues | | MickEdwards0 -
Changing URL - Ranking Disappeared?
Right, it will take some time. Most importantly, if your site is not very big, the wait can be even longer. Matt Cutts said clearly in a video that a 301 redirect passes the same link juice. As for the method you have mentioned, I do not think you made any mistake with the redirecting.
Technical SEO Issues | | csfarnsworth0 -
Website Penalised | Tips On Recovery?
You may have to get some of those over-optimized external links removed. If you can't get them removed, you may have to disavow them in Google Webmaster Tools. Here are some resources for you:

https://support.google.com/webmasters/answer/2648487?hl=en
http://www.bing.com/webmaster/help/how-to-disavow-links-0c56a26f
http://cyrusshepard.com/boom-1-email-60-bad-links-gone-4-tools-for-easy-link-cleanup/
http://www.rmoov.com/index.php
http://moz.com/blog/google-disavow-tool

It sounds like your site is being filtered just at the keyword level, which is a good thing. If you had a site-wide penalty for spammy links it would be much more difficult to get your rankings back. Just work on making the old links more natural-looking and removing or disavowing any that you can't make more natural-looking that seem spammy or possibly paid-for. Be ruthless in your culling of these links and you should bounce back within a couple of months.
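For reference, a disavow file is just a plain-text list you upload to Google's disavow tool: one URL or domain per line, with "#" for comments (the domains below are placeholders):

```text
# Paid links we could not get removed
domain:spammy-directory.example
http://link-farm.example/page-with-paid-link.html
```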
Search Engine Trends | | Everett0 -
How can I recover from an 'unnatural' link penalty?
Hi Lewis. No, you will be fine if you amend the anchor text; I would just make sure that you change the anchor text to a more 'long-tail' phrase. For example, instead of 'roller banners' you could change it to something like 'cheap roller banners in the UK' - the main thing is, you get the keyword in there padded by generic text. Hope this helps
White Hat / Black Hat SEO | | PIXUS0 -
Page 1 Ranking - Disappeared!
Hi Lewis, I'm just checking in to see if you saw John's response. Please let us know -- we're here to help! Thanks.
White Hat / Black Hat SEO | | Christy-Correll0 -
Development Website Duplicate Content Issue
Hi Lewis. To be honest, you've done everything absolutely right. The only other thing I could suggest would be to add a meta robots noindex tag to the head of those pages, if you haven't already. But from everything you've said, and having tested it myself, it doesn't look as though Google now has any chance of reading duplicate content from the dev sub-domain. Have you received a warning in Google Webmaster Tools? Have you submitted a reconsideration request if you think you have been penalised? There shouldn't be any duplicate content issue now that is affecting the live site's ability to rank. Panda refreshes may take a while, if indeed you have been hit at all, but you've done everything here spot on in my eyes. With that in mind, I'd focus on building quality content and links to your live site in order to try and rank it. I think you just need to show a bit of blind faith in Google to sort this out themselves.
Technical SEO Issues | | TomRayner0 -
Development Website Duplicate Content Issue
Glad that helped, Lewis. Unfortunately, there's really no way to determine how long the 301-redirect process will take to get the URLs out of the SERPs. That's entirely up to the search engines, and I've never seen much consistency in how long this takes for different cases.

One other thing you could do to try to speed up the process is to add an XML sitemap to the dev site and verify it in both Google and Bing Webmaster Tools. (Only do this AFTER you have added the meta robots noindex tag to the remaining pages' headers!) This will help remind the crawlers of the dev pages, and hopefully get them to visit sooner, thereby noticing the redirects and individual noindexes and taking action on them sooner.

Personally, I'd let the process run for 2 or 3 weeks after the dev pages get re-indexed without the robots.txt block. If the pages are gone, job done. If not, at that point I'd re-evaluate how much damage is being done by still having the dev site in the SERPs. If the damage is heavy, I'd be seriously tempted to use the URL Removal Tool in Bing & Google Webmaster Tools to get them out of the results so I could move on with building the authority of the primary domain (even though that would throw away the value the dev pages have built up).

REMEMBER! Once you've removed the robots.txt block, the meta titles and especially meta descriptions of the DEV site are what will, at least temporarily, be showing in the SERPs once the pages get re-indexed. So make certain they have been fully optimised as if they were the real site. That way, at least in the near term, you'll still be attracting good traffic while waiting for the pages to hopefully drop out. This may allow even the dev pages to do well enough at bringing in traffic that you can afford to wait until they drop out naturally.
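For reference, a minimal XML sitemap for the dev subdomain would look something like this (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://dev.example.com/page-1</loc></url>
  <url><loc>https://dev.example.com/page-2</loc></url>
</urlset>
```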
As far as seeing the additional 70 or so pages that are indexed: as Dan says, at the bottom of the search page is this paragraph and link: "In order to show you the most relevant results, we have omitted some entries very similar to the 3 already displayed. If you like, you can repeat the search with the omitted results included." When you click on that link, you'll see the additional pages. This is called the supplemental index and usually means these pages aren't showing up very well in the results anyway. Which means that for most of them, it will be sufficient to make sure you've added the meta robots noindex tag to their page headers to get them removed from the index and avoid future problems. Does all that make sense? Paul
Technical SEO Issues | | ThompsonPaul0