Questions
HTTP vs HTTPS Duplicate Issues
Since HTTPS is now a ranking signal, it is better to use the HTTPS version as the canonical. I would personally make every page of the site HTTPS via 301 redirects (rel=canonical also works, but can be trickier to implement):

http://site.com --301--> https://site.com
http://site.com/page1/ --301--> https://site.com/page1/
etc.

This may require a few changes to the site (internal links shouldn't go through unnecessary redirects, the HTTPS site should be added to Search Console (Webmaster Tools), etc.), so make sure you look around for resources on migration. If you decide to keep HTTP only, do not noindex or disallow HTTPS, because you may have valuable links pointing to HTTPS which help your ranking.
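As a minimal sketch of the blanket HTTP-to-HTTPS 301 (assuming an Apache server with mod_rewrite enabled; your stack may differ), the rules can live in .htaccess:

```apache
# Send every plain-HTTP request to the same host and path over HTTPS
# with a permanent (301) redirect, e.g.
# http://site.com/page1/ -> https://site.com/page1/
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

On nginx or IIS the equivalent is a server-level redirect rule; the key point is that it must be a 301 (permanent), not a 302.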
Technical SEO Issues | | AxialDev0 -
Spammy Inbound Links
Hi there!

I would recommend checking your internal links and sitemap to make sure there are no links pointing to the subdomain. These can be quickly found with a ScreamingFrog crawl.

From there, I would conduct a quick backlink audit to see if there are any backlinks pointing to the subdomain that can be removed or, at the very least, disavowed. You may also find good, quality, relevant links that you want to keep - feel free to reach out and update those to point to the domain, or whatever URL is relevant on your main site.

If the subdomain is in fact done, over, and gone, you have the opportunity to noindex it, block it with robots.txt, and ask Google to remove it from its index as well. You can also look into a 410 status code, which tells crawlers this page is never coming back, ever. You'll need to discuss all of these options with your team, of course, and weigh the options!

Hope this helps! Good luck!
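If you do retire the subdomain, the robots.txt block mentioned above is a single file served at the subdomain's root; a minimal sketch (subdomain name hypothetical):

```text
# robots.txt served at http://old.example.com/robots.txt
# Blocks all well-behaved crawlers from the entire subdomain.
User-agent: *
Disallow: /
```

One caveat worth weighing with your team: a robots.txt block stops crawling, so a noindex tag on those pages may never be seen by Google - pick one mechanism, or pair the block with Search Console's URL removal tool.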
Intermediate & Advanced SEO | | PatrickDelehanty0 -
Disavow to all domains?
It's best practice to verify all versions of your site in Webmaster Tools (now called Google Search Console) and submit your disavow file to each of them. Most likely, though, if you only use one version (i.e. if all of your URLs are https://www.... and all other versions redirect to this one), then submitting to that version should be enough. But there's no harm in being on the safe side and submitting the file to all versions.
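For reference, the disavow file itself is a plain UTF-8 .txt file with one entry per line (the domains below are made up for illustration):

```text
# Lines starting with # are comments and are ignored.
# Disavow a single URL:
http://spam.example.com/link-page.html
# Disavow every link from an entire domain:
domain:spam.example.com
```

The same file can be uploaded unchanged to each verified version of the property.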
Technical SEO Issues | | MarieHaynes0 -
Best UK PPC management company
Thanks for being patient while I did some digging.

First, because you aren't a huge spender, I think you need a flat rate. This makes payments predictable, and you can really start to value the work being done and trust the manager of the account. Your money will go toward working for you, rather than being micromanaged at the bid level.

Second, when you are calling around and making decisions about who to work with, be aware of their internal structure. If you were my client at Distilled, I would be your point of contact and I would be the one working on your account. I think you should look for a company with an internal structure where you will also have this direct path for communication, because then you know you are getting pure, unadulterated information on your account.

Recommendations: the moment you've been waiting for! Because I think it's so important that you have a flat rate, I'd suggest GetSquare.co.uk. I spoke with a trusted, knowledgeable friend there who said "for small spend accounts we generally charge a fixed monthly management fee that covers optimisation, management & reporting." They're located up in Edinburgh, if location matters to you.

Another alternative to Anicca would be Boom-Online.co.uk. I also know first hand that there are knowledgeable people there who you can trust. They are located in Nottingham. They do a review, then off the back of the review price a one-time set-up fee, and then it's a flat fee based on spend and number of campaigns.

Whatever you do, be sure you feel you are getting the most out of your preliminary conversations with any agency you work with. Let me know if this information has been helpful, or if you have any lingering questions!
Paid Search Marketing | | JasmineA0 -
Yahoo Directory - Is it worth it?
Hi again!

Despite how much people say SEO is changing, I'm not sure I would change my answer much from the one I gave just under 3 years ago. Let me update one statement at a time.

"I don't think that there are many categories where the traffic sent warrants $299/year any more. In fact I would say that in most cases the referral traffic from a Yahoo directory listing is close to zero."

The above is still true, but even more so now. Do you get any traffic from it?

"As you say though, it has a lot of authority. It is also well established, tightly curated and is certainly known by those building engines. Therefore inclusion could at least be seen as a check that the site is not utter tripe. However, logically at least, a link in their directory shouldn't lend much more weight than that."

Any value on this front has probably diminished even further. If it was ever seen as a special case that highlighted quality sites, I don't think that would be the case any more. The overall quality has diminished, great sites don't bother listing any more, and Google has had another 3 years to work that out.

"I have not seen a measurable result from either adding or removing a site from the directory in recent years. It may well bring benefit as part of a bigger picture, who other than Google can really say?"

No change there, although I probably haven't added a site to the Yahoo directory since then. I wouldn't expect to see an increase from adding your site - even factoring in the additional links from directories that use it as a seed. People aren't really creating directories like that any more, and if they did, links from them would likely be more damaging than helpful. I can't imagine that anyone would use the Yahoo directory as a seed for a quality site any more anyway.

It's worth mentioning that it is technically a different directory now, but I still don't think it is worth it. Wouldn't you rather have spent that last $900 on something more tangible?
Branding / Brand Awareness | | matbennett0 -
Crawl Diagnostics Questions
Hi Niall,

This isn't a case of the canonical tag being properly applied, but a case where two or more pages are so similar in code that they are setting off the SEOmoz duplicate content flags.

First of all, those pages look different to us humans. But the SEOmoz web app uses a similarity threshold of 95% of the HTML code. This takes everything on the page, both hidden and visible, into account. In this case, it's counting all of the navigation and sidebar as well, which is significant. What's left of the unique content - the part that matters - makes up less than 5% of the code.

Here's a tool you can use to check the similarity: http://www.duplicatecontent.net/ I ran the pages through a couple of tools, which showed 98% HTML similarity and 99% text similarity.

For perspective, take a look at Google's cached version of one of these pages. This is how Googlebot sees the page: http://webcache.googleusercontent.com/search?q=cache:mdybPKIjOxUJ:www.fredaldous.co.uk/craft-shop/general-crafts.html+http://www.fredaldous.co.uk/craft-shop/general-crafts.html&hl=en&gl=us&strip=1

That, as we say, is a lot of links! Since Panda, when I see a site with this many navigation links, I usually advise restructuring the site architecture into more of a pyramid shape, so that you reduce the overall navigation on each page.

Hope this helps! Best of luck with your SEO.
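The kind of similarity check the linked tool performs can be roughly sketched in Python with the standard library (the sample pages below are made up, purely to show how shared navigation dominates the ratio):

```python
from difflib import SequenceMatcher

def html_similarity(page_a: str, page_b: str) -> float:
    """Return a 0-1 similarity ratio between two raw HTML strings."""
    # autojunk=False: the default heuristic discards frequent characters
    # on long strings, which skews results for repetitive HTML.
    return SequenceMatcher(None, page_a, page_b, autojunk=False).ratio()

# Two hypothetical category pages: identical boilerplate (nav, sidebar)
# wrapped around a small block of unique content.
boilerplate = "<nav>" + "<a href='#'>link</a>" * 50 + "</nav>"
page_a = boilerplate + "<p>General crafts content</p>" + boilerplate
page_b = boilerplate + "<p>Paints and brushes page</p>" + boilerplate

print(f"{html_similarity(page_a, page_b):.0%}")  # well above a 95% threshold
```

Even though the visible content differs completely, the shared markup pushes the ratio past the duplicate threshold, which is exactly what happens on your pages.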
Technical SEO Issues | | Cyrus-Shepard0 -
Different HTML based on resolution
SEO-wise, I don't think there are any issues, though I wonder what resolution Googlebot reports itself as having. It's important, though, that you do it the way Chas Blackford states; if you have actual server-side code that changes a bunch of things around based on resolution, then you might get in trouble.

This is an interesting article about using stylesheets to segment mobile layouts (it also mentions media queries, which are kind of smartphone/new-phone specific): http://www.alistapart.com/articles/return-of-the-mobile-stylesheet

There are some implementation issues, the most important of which is reliably getting the resolution from the agent. Essentially, you can't guarantee it 100% of the time. From what I've read, a combination of user-agent string matching and resolution detection can probably get you most of the way, though.
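A minimal sketch of the stylesheet-only approach the article describes (class names hypothetical): the HTML served to every agent, including Googlebot, stays identical, and only the CSS that applies changes with the viewport:

```css
/* Default (desktop) layout. */
.sidebar { float: right; width: 30%; }

/* Media query: on narrow screens, stack the sidebar under the content.
   Same HTML, different presentation - no server-side branching. */
@media screen and (max-width: 480px) {
  .sidebar { float: none; width: 100%; }
}
```

Because the markup never changes per device, there is nothing for a crawler to see differently, which is why this route is safer than server-side resolution switching.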
Technical SEO Issues | | icecarats0