Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Google Change of Address with Questionable Backlink Profile
Thanks Ryan, that's definitely where our focus is, especially given that resources are limited for the time being.
| LukeHardiman0 -
Google News URL Format
Hi all, Is it still the case that you can submit EITHER with 3 digits in the URL OR via a news sitemap? I can't see anything in the official instructions about the sitemap route... they seem pretty insistent on the 3 digit rule though. Can we do it just by submitting a news sitemap via GWT? Do you still have to go through the inclusion process here: http://support.google.com/news/publisher/bin/bin/static.py?hl=en&ts=2394225&page=ts.cs&from=191208 Thanks guys... MB.
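For reference, a news sitemap is just a standard sitemap with an extra news: namespace block per URL. A minimal sketch in Python using only the standard library; the publication name and article URL here are made-up examples, not anything from Google's docs:

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
NEWS = "http://www.google.com/schemas/sitemap-news/0.9"
ET.register_namespace("", SM)
ET.register_namespace("news", NEWS)

def news_sitemap(articles):
    """Build a Google News sitemap from (url, title, pub_date) tuples."""
    urlset = ET.Element(f"{{{SM}}}urlset")
    for url, title, pub_date in articles:
        u = ET.SubElement(urlset, f"{{{SM}}}url")
        ET.SubElement(u, f"{{{SM}}}loc").text = url
        news = ET.SubElement(u, f"{{{NEWS}}}news")
        pub = ET.SubElement(news, f"{{{NEWS}}}publication")
        ET.SubElement(pub, f"{{{NEWS}}}name").text = "Example Gazette"
        ET.SubElement(pub, f"{{{NEWS}}}language").text = "en"
        ET.SubElement(news, f"{{{NEWS}}}publication_date").text = pub_date
        ET.SubElement(news, f"{{{NEWS}}}title").text = title
    return ET.tostring(urlset, encoding="unicode")

xml = news_sitemap([("http://www.example.com/business/article123.html",
                     "Companies A and B Merge", "2012-05-04")])
```

The output of something like this is what you would submit in GWT under Sitemaps.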
| MattBarker0 -
Duplicate Content Question (E-Commerce Site)
The ideal solution would be a meta refresh tag, but the problem is that search engines are against this practice. So, as things stand, you have to use a 301 redirect to fix this issue, and I know there is a possibility that rankings may suffer a bit. Here is my suggestion: what about making some changes to the body content and using the page as a landing page for a new product? You may only need to adjust the specification or modify the title slightly. If you can manage that, you will be in a win-win situation. Let us know what you think.
| Debdulal0 -
Using Rel=Author with Multiple Contributors
Oh man, that is too bad; I was really hoping for an authors page versus individual pages.
| PLEsearch0 -
Should I no follow all external links?
That is true. If you nofollow a link, nobody gets the PR.
| EGOL0 -
Standard Responses Causing Duplication Issues
Rather than copying and pasting the same thing over and over again, you can create an FAQ section on your website, with a dedicated landing page for each answer. Once that's done, whenever you see someone post a question that matches the FAQ, you can reply with the URL and point the asker there for more information.
| Debdulal0 -
Client with Very Very Bad Onsite SEO
Let me explain with an example. Suppose you meet a person who, on the surface, looks really good: well dressed, healthy, smart, giving a first impression of being perfectly fine. You'll notice that other people are attracted to him and feel good being associated with him, just as you do. But deep inside he is completely shattered and mentally upset; there is a lot of destruction going on within him that he doesn't show to other people. From the outside he is great, but inside he is broken. Do you think that person can be anyone's friend and help them out of their problems? Can you expect any kind of support from him? Definitely not! He himself is in desperate need of help. From the outside you may think the opposite, but the fact is you are being deceived; a good disguise doesn't have to be true. Your client's website has exactly the same story: it looks good from the outside, but technically it's broken. I'd advise you to discuss everything openly with your client; otherwise, I'm afraid you will lose your reputation and goodwill at the end of the day.
| earlyadopter0 -
404 vs 301
A 301 redirect is generally the preferred option to keep credit for any links those pages have gained. You can check whether a page has links in Open Site Explorer, but it doesn't always find every inbound link, so there may be inbound links you don't know about. My suggestion would be to create a page explaining that the product line has been discontinued, with links to recommended alternatives, and then 301 redirect the old pages to that new page. Otherwise customers might be confused as to why they are being redirected to your homepage and bounce.
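The approach above boils down to a small redirect map: every discontinued product URL permanently points at one explanatory landing page instead of the homepage. A minimal sketch in Python (the paths are hypothetical; in practice this logic would live in your server config or application router):

```python
# Map discontinued product URLs to a single explanatory landing page,
# rather than dumping visitors on the homepage.
REDIRECTS = {
    "/products/old-widget": "/products/discontinued",
    "/products/old-gadget": "/products/discontinued",
}

def resolve(path):
    """Return (status, location) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent: passes link credit along
    return 200, path                 # serve the page normally
```

The key design choice is using 301 (permanent) rather than 302 (temporary), since only the permanent redirect signals that link credit should move to the new page.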
| Charlessipe0 -
What is "evttag=" used for?
Well, it's certainly not an SEO thing. In fact, it's not even valid HTML; here are all of the acceptable attributes for a DIV in HTML5: http://dev.w3.org/html5/spec/single-page.html#the-div-element It is, among many other things, breaking realtor.com's code through the validator: http://validator.w3.org/check?uri=http%3A%2F%2Fwww.realtor.com&charset=(detect+automatically)&doctype=Inline&group=0 In my experience, this is one of the easiest ways to ensure that Google can't crawl your site well or put confidence in your user experience. Other new standards, like schema.org, do get pretty creative with structured data, but you'll notice that those use itemscope/itemprop/itemtype, which do still pass the W3C validator: http://schema.org/docs/full.html In addition, there's no reference to an outside standard, even one that's foreign to generally accepted web development (similar to, say, Facebook's OpenGraph, which is sometimes used like this): xmlns="http://www.w3.org/1999/xhtml" xmlns:fb="http://www.facebook.com/2008/fbml" xmlns:og="http://opengraphprotocol.org/schema/"> I award realtor.com's web developers 0 points, and may the Flying Spaghetti Monster have mercy on their SEOs. To put that another way: don't copy this.
| CoreyNorthcutt0 -
Hosted Wordpress Blog creating Duplicate Content
Hi Jarno, I'm going to install Wordpress on our own server soon so we have more control and can add SEO plugins. Don't know if that will clean up the mess already created, though. Thanks, Tom
| TomHu0 -
Host sitemaps on S3?
My general take on this sort of scenario is first to eliminate the redundant hostnames with round-robin DNS, add extra server power with software-based load balancing in the interim (with a solution like InterWorx), and break out the database servers. If you do that, you should have a nice little server cluster that's crazy efficient and scalable. You can add a CDN to the mix if you like as well. With all of that, SEO works the same way as on a single server, and sitemaps can then be generated dynamically really easily (in under 25 lines of code, most of the time). If you just want a way to mirror static files, you'll want to look at rsync. And finally, as for S3, my personal opinion is to stay away. I'm an SEO, but I also spent 7 years building a hosting company. Those solutions sound great in their marketing, but they are demonstrably less reliable than standard hosting, and you can verify that via public uptime-tracking sites like HyperSpin.
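As a sanity check on the "under 25 lines of code" claim, here is a minimal dynamic sitemap generator in Python; the URLs are hypothetical, and in practice you would feed it the URL list straight from your database:

```python
from datetime import date
from xml.sax.saxutils import escape  # XML-escape &, <, > in URLs

def sitemap(urls):
    """Render a standard XML sitemap for an iterable of page URLs."""
    today = date.today().isoformat()
    entries = "".join(
        "<url><loc>{}</loc><lastmod>{}</lastmod></url>".format(escape(u), today)
        for u in urls
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            + entries + "</urlset>")

xml = sitemap(["http://www.example.com/",
               "http://www.example.com/products"])
```

Serving this from a single canonical hostname sidesteps the static-file-mirroring question entirely.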
| CoreyNorthcutt0 -
Diagnosing Canonical Errors: Is Screaming Frog reliable?
Hey, it's been a long time since the question; I was just wondering whether you ever worked it out. Gr., Istvan
| Keszi0 -
Google Rejects Merchant Feed
I'm just going to share that Google has specifically mentioned to me that there are issues on their end with some of these tools. That doesn't make it any easier when you're on the receiving end; however, they are aware of the impact and the user issues.
| josh-riley0 -
Form output
I typically noindex these. You would only use a canonical if the two pages have identical content.
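To make that rule of thumb concrete, a small sketch of the decision as a helper that emits the right head tag (the URLs are hypothetical; use noindex for form-output pages that shouldn't rank at all, and canonical only when the page duplicates another URL):

```python
def head_tag(duplicate_of=None):
    """Return the <head> tag for a form-output page.

    duplicate_of: the canonical URL if this page mirrors another, else None.
    """
    if duplicate_of:
        # Identical content exists elsewhere: consolidate signals there.
        return '<link rel="canonical" href="{}">'.format(duplicate_of)
    # Unique but low-value output: keep it out of the index entirely.
    return '<meta name="robots" content="noindex,follow">'
```

Note the `follow` in the robots tag: the page stays out of the index, but links on it can still be crawled.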
| eli.boda0 -
Verifying hreview reviews
Hello, To get the desired results, you should choose whichever format works most easily with your code, as Google supports several formats for rich snippets: microformats, RDFa, and microdata. Here is the gist of a microdata example (the markup was stripped when this was posted): Blast 'Em Up, a game review, rated 88 out of 100 based on 35 reviews. Use this tool to verify your code by entering it in the HTML tab; it will tell you whether your rich snippets are working or not: http://www.google.com/webmasters/tools/richsnippets You can use any star-rating script (5 or 10 stars), with a comment box appearing on rating where the visitor can leave a review. You can find more about these formats on Schema.org. Here is Google's tutorial for guidelines: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=172705&topic=1088474&ctx=topic
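Since the example in the answer lost its markup when pasted, here is a reconstruction as a Python string; it's a sketch that assumes schema.org's Product/AggregateRating vocabulary (which the answer points to), not necessarily the exact markup the original post contained:

```python
# Reconstruction of the stripped example: an aggregate review rating
# marked up in schema.org microdata, using the values quoted above.
snippet = """\
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Blast 'Em Up</span> &mdash; Game Review
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rating: <span itemprop="ratingValue">88</span> out of
    <span itemprop="bestRating">100</span> based on
    <span itemprop="reviewCount">35</span> reviews.
  </div>
</div>"""
```

Pasting markup like this into the rich snippets testing tool linked above is how you confirm whether the rating is being picked up.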
| KLLC0