Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
What is the effect of using jQuery sliders for content on SEO?
Thanks Ryan! Great answer.
| peb72680 -
Ad units or % of ads vs content?
Keep in mind that above the fold you have your site logo, persistent navigation, search box and other assets, so I think your ratio should be considered after those are subtracted out. After that it depends upon presentation. If the visitor is slapped in the face with two big ad blocks and has to scroll down to read content, that is too much. However, if ads are on the edges of the site and content is front and center, that is a lot better.

Ad format can be considered too. If your site is heavy with images, then big image ads can compete with the content. My goal is to make sure that content is the most obvious thing within the design of the site and ads are subordinate. Would people want to link to ads or to content? Make sure your presentation encourages them to link to content. I have a high-PageRank blog and link to informative sites every day. I stopped linking to a couple of very high-profile sites that are household names when they started opening popups on my visitors and making ads more obvious than the content on their pages.
| EGOL0 -
How can we minimize the SEO impact during a major platform migration?
Hey, thanks. Is there some tool or feature in the Google Webmaster account that shows us info about the caching frequency? Also, we only set up 301s for 400 of the 8,000 URLs; only those 400 had external links pointing to them. It would be a massive project to redirect the other 7,600 URLs. Do we need to do that?
| outofboundsdigital0 -
Is 404'ing a page enough to remove it from Google's index?
Nice information, John. I hadn't thought of adding a temporary page with a noindex tag, but that sounds like the way to go for faster results. I know Google has automatically removed 404 pages in the past. I ran into the same issue Michelle is describing, and the information you shared offers great detail on the process.
| RyanKent0 -
Setting a 404, best practices
If for any reason a web page is not found, your web server should return a 404 HTTP status code. Wikipedia offers a very readable status code list: http://en.wikipedia.org/wiki/List_of_HTTP_status_codes The challenge you are facing is that Google does not automatically de-index a URL after receiving a single 404 error. At times Google might crawl a site while a page happens to be unavailable, or even your whole site, so in my opinion the best behavior is for Google to wait until it has received a 404 response a few times, then remove the page. Make sure your page is not blocked by robots.txt, or else Google will never be able to recrawl the page and see that it is 404'd.
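As a hedged illustration (assuming an Apache server; the file names here are placeholders), a custom 404 page is typically wired up with a one-line config directive:

```apache
# .htaccess — serve a friendly page while still returning a real 404 status
ErrorDocument 404 /404.html
```

One caveat: point `ErrorDocument` at a local path, not a full URL. If you use an absolute URL, Apache issues a redirect and the response becomes a 302 rather than a 404, which defeats the de-indexing behavior described above.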
| RyanKent0 -
Should I Host New Blog On Different Server?
I don't think you can outsmart Google by putting websites on various subnets for hosting. You really need to think about all the ways Google can connect the dots to show it is a mini-net: WHOIS data, GA tracking codes, AdSense ID codes and so on. If you have a blog and the main site, my advice is to host it all on one domain. That said, you can host them as two separate websites if you are using different TLDs and want to geo-target your website.
| JamesNorquay0 -
What should my optimal anchor text look like, given cannibalization risk?
I am not certain of the mechanics of exactly how the tool works. Please change the page title as I suggested; then you and I can try running the tool again and see if the results change. The words in your title are supposed to indicate your targeted keywords, and all those extra words could be confusing the tool.
| RyanKent0 -
Geo-targeted homepage for users vs crawlers
Your most recent Google cache is showing the target as Toronto, Ontario, confirming your concerns. The challenge you are facing is similar to what sites like Yellowpages.com, Superpages.com and other location-targeted sites may face. The way these sites have worked around it is by adding relevant static content to the website. By adding relevant content to your home page you will be able to minimize the location-targeting effect. Plus, search spiders already know that you are not limited to the location of your IP because they have indexed multiple pages from your website. I also noticed that your home page's outbound-link-to-content ratio is upside down: you have too many links on the homepage and extremely limited content. Additionally, the text blurb at the top is not user friendly and is stuffed with too many keywords. You could easily replace this text with user-friendly static content such as your website's USP.
| ninjamarketer0 -
Help with canonical tag
Yes, as long as the "www" is included in the URL of the canonical tag.
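As a concrete illustration (example.com is a placeholder for your own domain), the tag in the page's `<head>` would be:

```html
<!-- canonical tag pointing at the preferred "www" version of the URL -->
<link rel="canonical" href="http://www.example.com/your-page/" />
```

The same tag should appear on every non-www or otherwise duplicate variant of the page, all pointing at the single preferred URL.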
| RyanKent0 -
Ideas on Best way to move microsite content but preserve the microsite.
Think of IP addresses as being formed in the following pattern: AAA.BBB.CCC.DDD. When your URL resolves to 100.200.222.050, your C block is "222". Each hosting company is usually assigned one C block for all their servers. So yes, you would need to use a different company for your second site.
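To make the pattern concrete, here is a minimal Python sketch (the function name is my own) that pulls the C block out of an IPv4 address and checks whether two hosts share one:

```python
def c_block(ip: str) -> str:
    """Return the first three octets of an IPv4 address.

    Two servers in the same 'C block' share these three octets
    (i.e. the same /24 network).
    """
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError(f"not an IPv4 address: {ip}")
    return ".".join(octets[:3])

# Example: both hosts sit in the 100.200.222 C block.
print(c_block("100.200.222.50"))                               # 100.200.222
print(c_block("100.200.222.50") == c_block("100.200.222.99"))  # True
```

If the two results match, the sites are hosted on the same C block, which is exactly the situation the answer above suggests avoiding.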
| RyanKent0 -
Duplicate Content across 4 domains
EGOL - thanks for the advice. Yes, there are lots of links between the domains. Although I also have clients who have done this deliberately for perceived gain, I think in this case the client has made an honest mistake by simply applying their CMS (with relative links) to each of the domains they thought they should purchase. It's confused a bit further by one section using https with an absolute domain, so users can end up migrating from one domain to another and from http to https! As an SEO I also have an inclination to 301 page-by-page to the best-ranking site. However, as I mentioned to Thomas (above), I think the client will probably want to go with their preferred domain, and as such I'll 301 page-by-page to that one. I'll discuss with the client and post the outcome.
| bjalc20110 -
With Panda, which is more important, traffic or quantity?
I would focus on the thin content that accounts for 50% of your traffic. From what I've seen, the Panda update may consider what visitors do when they get to your site, i.e., high bounce rates, low page views, etc. can negatively affect rankings. Focus on the pages that already get traffic and improve their experience, and Google will reward you.
| iAnalyst.com0 -
Optimising My Website Link Containers
Hi there, a quick reply from me.

1. If I were you I would make my links text and not images. If you do have to add images, avoid names such as "read more". There are JavaScript tricks you could use if you must have images, such as replacing text with an image on "dom ready"; there is no reason to show Google an image if it's only a "read more" button anyhow.

2. If I make a list of links, I make it a list; don't use divs if a list makes more sense, e.g.: link 1, link 2, link 3.

3. Since speed matters: if you can avoid images, do it. If not, use CSS sprites and the background property. If you must use img tags, always add a height and a width; that way the browser won't have to read the image size and the page will render quicker.
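As a sketch of points 1 and 3 combined (the class name, sprite path and offsets are placeholders), a "read more" link can stay as crawlable anchor text while a CSS sprite supplies the visuals:

```html
<!-- A text link styled as a button: crawlers see real anchor text -->
<a class="more" href="/article/full-post">Read the full article</a>

<style>
/* CSS sprite: one shared image, shifted into view with background-position */
.more {
  display: inline-block;
  width: 120px;
  height: 32px;                         /* fixed size: no layout reflow */
  background: url(/img/sprites.png) 0 -64px no-repeat;
  text-indent: -9999px;                 /* hide text visually, keep it in the markup */
}
</style>
```

Because the button graphic lives in one shared sprite file, the browser makes a single image request for every button on the page.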
| ReneReinholdt0 -
Old pages still crawled by SEs returning 404s. Better to 301 or block with robots.txt?
Hi Matteo. The first step I would suggest is determining the source of the links to these 404 pages. If these links are internal to your website, they should be removed or updated. The next step I would recommend is to ensure your site has a helpful 404 page. The page should offer your site's navigation along with a search function so users can locate relevant content on your site.

I realize that thousands of broken links may seem overwhelming. It is a mess which should be cleaned up, and how you proceed depends upon how much you value SEO. If your ranking is important and you want to be the best, you will have someone investigate every link and make the appropriate adjustment: either 301 redirecting it to the most appropriate page on your site, or allowing the link to continue to the 404 page.

It's a search engine's job to help users find content. 404s are a natural part of the web, and there is nothing inherently wrong with having some 404 pages; having thousands of them, though, suggests your site has significant issues. Google's algorithms are not revealed publicly, but it's logical to believe they may consider sites with a high percentage of 404 pages less trustworthy. This is my belief but not necessarily that of the SEO community.
| RyanKent0 -
Good category pages - do you have examples?
Thanks for the response. Not sure about the smileycat link. They all look good, but none of the category pages have text on them for optimization.
| DavidLenehan0 -
De-indexing search results: noindex, follow or noindex, nofollow?
Each situation is unique, so there's no single answer. Here's a great article by Lindsay on the options.
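For reference, the two options in the question differ only in the robots meta tag placed on the internal search-result pages (the tag syntax is standard; which variant fits is the situational judgment call the answer refers to):

```html
<!-- Keeps the page out of the index but lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Keeps the page out of the index AND stops crawlers from following its links -->
<meta name="robots" content="noindex, nofollow">
```

The common argument for `noindex, follow` is that link equity can still flow through the de-indexed pages to the content they link to, whereas `noindex, nofollow` cuts those pages off entirely.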
| AlanBleiweiss0