Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Cross-domain duplicate content...
If the alternative is just de-indexing those duplicate pages on one website, then I'd definitely recommend the cross-domain canonicals, yes.
| BradyDCallahan1 -
We are looking to hire an SEO company to assist in upgrading a site to WordPress
At the bottom of this page, click on Recommended Companies... that's the best advice here!
| JVRudnick0 -
Landing pages "dropping" and being replaced with homepage?
Hi Richard, Good shout. After doing some research, it looks as though the page that has now been replaced with the homepage actually got a 100% CTR for a search term that isn't really relevant to our service. Only 1 click, though. The relevant searches did get a good few impressions but didn't get a single click. I guess that has something to do with the sudden change. Still, it seems a bit harsh for Google to take action over only 1 click ...
| SanjidaKazi0 -
Adjusting Display of Sitelinks
Thanks! This is pretty much what I thought, but I figured I would check. Google is using all caps for some words in other sitelinks, so my guess is it comes from anchor text somewhere on the site. Much appreciated!
| DRSearchEngOpt0 -
Cookieless subdomains Vs SEO
Sounds like you should be all set! Thanks for the very kind words, too, Aires. Let me know if I or Moz can ever be helpful.
| randfish0 -
Robots.txt
You may be better off just doing a pattern match if your CMS generates a lot of junk URLs. You could save yourself a lot of time and heartache with the following:

User-agent: *
Disallow: /*?

That will block everything with a ? in the URL, so use it with caution, as always. If you're quite certain you want to block access to the image-sizes subdirectory, you may use:

User-agent: *
Disallow: /sizes*/

More on all of that fun from Google and SEO Book. Robots.txt is almost as unforgiving as .htaccess, especially once you start pattern matching. Make sure to test everything thoroughly before you push to a live environment. For serious. You have been warned. Google WMT and Bing WMT also provide parameter-handling tools that let you tell their bots to ignore URLs containing parameters you select. If you wanted to handle it that way, ignoring the app= parameter should do the trick for most of your expressed concerns. Good luck! explosions in the distance XD
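Since robots.txt wildcards are easy to get wrong, it can help to sanity-check a pattern against sample URLs before deploying it. A minimal sketch (my own helper, not part of any robots.txt library) that translates a robots-style path pattern into a regex:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt-style path pattern matches a URL path.

    '*' matches any run of characters; '$' anchors the end of the path.
    Matching is anchored at the start of the path, as in robots.txt.
    """
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(regex, path) is not None

# Disallow: /*?  blocks any path containing a query string
print(robots_pattern_matches("/*?", "/products?app=1"))   # True
print(robots_pattern_matches("/*?", "/products/shoes"))   # False
# Disallow: /sizes*/  blocks the image-size subdirectories
print(robots_pattern_matches("/sizes*/", "/sizes-large/photo.jpg"))  # True
```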
| Travis_Bailey0 -
Will merging sites create a duplicate content penalty?
Thanks for the answers folks.
| boballanjones0 -
Doing large scale visual link/content analysis
Looking at a screenshot of a website is a very poor way to determine content quality.
| Kingof50 -
Language/Country Specific Pages All in English
Thanks Keszi, Will send you a PM, appreciate your help and advice. Thanks. Gareth
| PurpleGriffon0 -
Pages that 301 redirect to a 404
I agree with Dave. When a page redirects to a 404, that is another redirect, but the URL mentioned doesn't exist, so there shouldn't be any links to it within your site. If a competitor is linking to a non-existent page, that link is external, and as long as it stays that way you should be just fine. The problem starts when you start linking to a non-existent page yourself. So keep it clean and implement the change, I would say.
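One way to act on this advice is to audit your internal links against your redirect map, so you never link to a URL whose redirect chain dead-ends in a 404. A minimal sketch with made-up inputs (the function and data structures are hypothetical, not from any crawler's API):

```python
def find_broken_internal_links(internal_links, redirect_map, live_urls):
    """Flag internal links whose final destination no longer exists.

    internal_links: URLs your own pages link to
    redirect_map:   {source: target} pairs, one per 301 redirect
    live_urls:      set of URLs that currently return a 200
    """
    broken = []
    for url in internal_links:
        final, seen = url, set()
        # Follow the redirect chain, guarding against loops
        while final in redirect_map and final not in seen:
            seen.add(final)
            final = redirect_map[final]
        if final not in live_urls:
            broken.append(url)
    return broken

links = ["/about", "/old-page", "/contact"]
redirects = {"/old-page": "/removed-page"}  # /removed-page now 404s
live = {"/about", "/contact"}
print(find_broken_internal_links(links, redirects, live))  # ['/old-page']
```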
| JarnoNijzing0 -
Best anchor text strategy for embeddable content
SamuelScott is 100% right. I only wanted to add that we should stop thinking about the anchor. Whenever we think about anchor text, manipulation is in the room. That's my opinion.
| paints-n-design0 -
Making a non-responsive site responsive, should I expect any ranking penalties?
I'd recommend you run some A/B tests. In our experience with site revamps, it's best to ensure your visitors are happy first, which you can measure using conversion rates. If you notice a big dip in conversions, something may be broken, which could then have a knock-on effect on your rankings. From a massive revamp we recently did to make things responsive (1.5M pages), we noticed an increase in most of our rankings. But remember to treat Google as a mobile user: if you start simple and progressively add more content via AJAX as the screen size increases, Google may not pick that extra content up, as it will see your 'simplest' page. So make sure you don't remove any important on-page factors.
| benseb0 -
Google Page Speed Score 91, But 5-8 Seconds to Download URL
Hi, What is the % of mobile visits? Could you check the page speed within Analytics using the 'Mobile' segment? I prefer to use webpagetest.org rather than Pingdom, because it seems to give more realistic results (Pingdom always seems to load faster than webpagetest). On your desktop version everything seems OK - the site loaded in 2.3 sec - though the images are very heavy: http://www.webpagetest.org/result/150103_1M_SYT/ Different story on mobile: http://www.webpagetest.org/result/150103_CT_T0H/ - initial page load = 14 sec, mainly because of the images (900K). I would check whether these images could be compressed, or remove the slider and replace it with a single image. rgds, Dirk
| DirkC0 -
Some sites' links look different in Google search. For example: Games.com › Flash games › Decoration games. How can we make our URLs appear like this?
I believe he is talking about breadcrumbs, not sitelinks. To get your breadcrumb rich snippets displaying in the SERPs, they must use the appropriate microdata markup. Some examples of how to implement this can be found here: http://builtvisible.com/micro-data-schema-org-guide-generating-rich-snippets/#breadcrumb And here: https://support.google.com/webmasters/answer/185417?hl=en Until recently, these breadcrumb links were clickable from the SERPs, but like a lot of things in SEO, this was exploited to do things it wasn't intended for, so Google removed the feature. You can read about why the clickable breadcrumb links were removed here: https://www.seroundtable.com/google-breadcrumb-snippets-drop-hyperlink-19595.html
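For illustration, here is what a marked-up breadcrumb trail might look like using schema.org microdata (the URLs and labels are made up, loosely following the Games.com example from the question):

```html
<ol itemscope itemtype="https://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="https://www.example.com/flash-games/">
      <span itemprop="name">Flash games</span></a>
    <meta itemprop="position" content="1" />
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="https://www.example.com/flash-games/decoration/">
      <span itemprop="name">Decoration games</span></a>
    <meta itemprop="position" content="2" />
  </li>
</ol>
```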
| davebuts0 -
WordPress Duplicate URLs?
"My developer claims that this is a common, and natural, occurrence when using WordPress." WordPress does tend to create duplicate pages. However, they can be easily managed/prevented. "...and that there's not a duplicate content issue to worry about. Is this true?" You definitely need to worry about duplicate content, especially when there are auto-generated pages, category pages, author pages, and tag pages. On your site, the /authors/fear URL has "http://www.quotery.com/topics/fear/" set as the canonical URL. So the developer did add a correct canonical setting to try to prevent duplicate content issues. However, a canonical setting is only a suggestion to the search engines, and they do not necessarily adhere to it. Therefore, since the pages seem identical, I suggest noindexing the /authors/fear page.
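If you go the noindex route, the tag itself is simple; a sketch of what would go in the head of the /authors/fear page (most WordPress SEO plugins can set this per page without touching templates):

```html
<!-- noindex keeps the page out of the index; follow lets
     link equity still flow through its outbound links -->
<meta name="robots" content="noindex, follow" />
```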
| Ray-pp1 -
Blog Content In different language not indexed - HELP PLEASE!
Ha, it happens. Every once in a while I do a full audit on our thousands of clients. I find the strangest things: meta robots blocked, robots.txt blocked, a client releases a new site and doesn't tell us, no onsite work done at all, homepage titles set to "Home." It happens; things get lost in the mix. Plus, most people deal with robots.txt and meta robots (at least more frequently). The simplest thing for future reference is just to drop the site into Screaming Frog (even the trial) and see what you see.
| MattAntonino0 -
Block Web Archive/Wayback Machine
You can block the Wayback Machine from crawling and creating a record of your site by adding the following to your robots.txt file:

User-agent: ia_archiver
Disallow: /

This will not only stop new records from being created but also stop people from viewing what had previously been indexed by the Wayback Machine. More information about this can be found here: https://archive.org/about/exclude.php
| davebuts2 -
Moving to https: Double Redirects
No problem. What was your reason for going to HTTPS? If it is for a ranking boost, you may find this an interesting read: http://blog.searchmetrics.com/us/2014/08/29/https-vs-http-analysis-do-secure-sites-get-higher-rankings/ I moved one site to HTTPS and saw no significant ranking boost from the change. I have also seen people implement it in a way that slowed their site down, which had the opposite effect. In my opinion, another big clue is the fact that sites like Moz and Search Engine Land haven't gone to HTTPS.
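On the double-redirect point specifically: the usual fix is to send every HTTP request straight to its final HTTPS URL in one hop, rather than chaining http -> https -> destination. A minimal .htaccess sketch for Apache (assuming mod_rewrite is enabled; adapt the rule if you also redirect www/non-www):

```apache
RewriteEngine On
# If the request is not already over HTTPS...
RewriteCond %{HTTPS} off
# ...301 it directly to the HTTPS version of the same URL, in a single hop
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```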
| Matt-Williamson0