Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Requesting New Custom URL for Google+ Local Business Page
I was just "offered" is a good word, a custom domain by Google last week. At first I thought great let me choose my username that I like (+vmialik) and use for things like Facebook, etc. to keep the consistency. But Miriam and Mike Blumenthal are right when they say Google did not give me an option to customize my domain. But having a full name as a custom url at least for my name works well. For businesses and organizations, I understand its a different story, with DBA's etc. Maybe in the future ... who knows. Hope this helps
| vmialik0 -
How do I fix my sitemap?
Hey Paul, I'm attaching a screenshot of the "Location of your sitemap file." I think these settings - which I didn't change - are correct. But, if I need to change something here, please let me know. Thanks for the help! I really have no idea what I did to cause this problem, but I guess, as long as it gets fixed, that's all that matters. Best, Ruben
| KempRugeLawGroup0 -
Disavow- What Happens and What Should I Do?
Had you received an unnatural links warning at all? Usually people suggest not using the Disavow Tool unless you've received that warning. What does your link profile look like? And have you worked on gaining more relevant links and social signals in the meantime? If it was an algorithmic penalty in July then it was possibly Panda-related but, as I don't have any more information to go on, that's just speculation. Assuming it was Panda, then your main issue was likely thin content, not links. How does your site look concerning duplicate content, spammy content, and thin pages?
| MikeRoberts0 -
Penalized for Similar, But Not Duplicate, Content?
I believe that Google would consider these to be "cookie cutter pages". That is when you have a standard page and several words or phrases are swapped in and out according to the product. Google started targeting these pages several years ago and they are getting better and better at identifying them. Their response to these types of pages can be....
-- filter most of them from the search results; this only hits the cookie cutter pages, but if you had lots of these pages it was a waste of the PageRank that went into them, which could have been used to support rankings on other parts of your site.
-- more recently, these are starting to result in a Panda problem, which can tank the rankings across your site.
When you are offering these types of products, the time required to write unique descriptions can pay off big time. My approach is to make one page for several closely related items, offering all of them on a single page. That makes a rich page instead of multiple near-duplicate pages.
| EGOL0 -
Directory Quality for Citation Building
No problem. A big giveaway is normally the URL; for instance, anything with numbers in it is a big red flag. I have just updated our profile as I noticed we hadn't included our websites. Hope you find some useful citations on the list! J
| EveronSEO0 -
This is a clear-cut canonical issue, right?
Yeah, I think it's best to suggest they rewrite their blog posts before re-distributing them now.
| Martin_S0 -
Google’s Hummingbird and Keyword Cannibalization
Tell your client he is on the right track with wanting to target long tail keywords (that will make his ego feel better) but that it's not so simple as adding one keyword to all the pages. After you choose which page to rank for "oak furniture" (my guess would be the home page or main oak furniture page), choose which long tail phrases containing oak furniture you want to rank for, then choose pages (if different from the main oak furniture page) that would be a good fit. For example, you may have different pages for "discount oak furniture" vs. "taking care of your oak furniture." Do the same with other variations like oak table, oak chair, etc. I would recommend doing this keyword research first and then taking that to the client with a plan of action. If this is too manual, try using your existing product browsing categories as top-level keywords (so all the oak chairs have "oak chair" in the title, etc). This will get you a bit more granular than just blanketing the site with the broadest possible term. Your client should still be prepared that only one or two pages from the site will rank for that broad of a term.
| RuthBurrReedy0 -
Given the new image mismatch penalty, is watermarking considered "cloaking"?
What I can tell you from the perspective of a big webshop is that we had to remove all our watermarks (it was just our logo, for protection against copyright infringement by our competitors) within 3 business days to avoid being banned from Google Shopping. So this issue shows the direction Google is moving in regarding image search... If what you describe is just the first step, then this will be - or even already is - the next one. I would avoid this kind of watermark as well...
| dotfly0 -
Please help :) Troubles getting 3 types of content de-indexed
Hi Fabio, if the content is gone when you visit your old URLs, do you get a 404 code? You can plug the old URLs into urivalet.com to see what code is returned. If you do, then you're all set. If you don't, see if you can just upload a robots.txt file to that subdomain and block all search engines. Here's info on how to do that: http://www.robotstxt.org/robotstxt.html -Dan
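For reference, a minimal sketch of a robots.txt that blocks all search engines from an entire subdomain (it would live at the root of that subdomain, e.g. at a hypothetical old.example.com/robots.txt, since the thread doesn't give the real hostname):

User-agent: *
Disallow: /

Keep in mind this only stops crawling; URLs that are already indexed can take a while to drop out, so a noindex approach or a URL removal request in Webmaster Tools can speed that up.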
| evolvingSEO0 -
Targeting local areas without creating landing pages for each town
My pleasure, Silkstream. I can understand how what you are doing feels risky, but in fact, you are likely preventing fallout from worse risks in the future. SEO is a process, always evolving, and helping your client change with the times is a good thing to do! Good luck with the work.
| MiriamEllis0 -
Dates in the URLs for a "hot" content website (tipping service)
Yes, using a date structure in URLs can help search engines understand the date context of the information. For example, CNN uses a date-based system where the date comes right after the domain, in /yyyy/mm/dd/ format:
http://politicalticker.blogs.cnn.com/2013/11/06/documents-show-first-days-of-obamacare-rollout-worse-than-initially-realized/?hpt=hp_t1
ABC News uses a similar structure:
http://abcnews.go.com/blogs/politics/2013/11/the-notes-must-reads-for-wednesday-november-6-2013/
As does the NY Daily News:
http://www.nydailynews.com/blogs/theraces/2013/11/daily-news-staff-aqueduct-selections-for-wednesday-november-6-2013
Hope this helps... Jeff
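As a rough sketch (assuming the site in question runs WordPress, which the thread doesn't actually say), the permalink structure that produces that /yyyy/mm/dd/ pattern would be:

/%year%/%monthnum%/%day%/%postname%/

On other platforms the same effect usually comes from a routing or URL-rewrite rule that prefixes each post's slug with its publication date.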
| customerparadigm.com0 -
Sitemap Issue - vol 2
Looks fine to me. Ran it through several XML sitemap validators and they all came back as valid. The only change that would make sense would be to update the urlset line to:
<urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd" xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
(Taken from Yoast)
| OlegKorneitchouk0 -
Town and County pages taking months to index.
Not 100% sure, but those "rogue" pages look like pagination pages and no-indexing them could cause some issues. They don't have any content on there, so I am not sure. Also, this is just correlation and I'm not saying it's causation, but Google+ links have gotten my sites indexed extremely fast, and the sites that utilize Google+ tend to have a higher percentage of indexed pages. Again, correlation, not necessarily causation. Search Engine Land had a good article on SEO problems with pagination here.
| DarinPirkey0 -
Local Schema
Each location you have should have its own landing page/URL. I'm not sure how the search engines would handle more than one of the same type of schema on one page, but I know for SEO purposes two separate pages are much, much better. You'll be able to add citation links to each of the pages that way too. Matt Cutts mentioned it's better to have one URL per store too.
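As a sketch of what per-location markup might look like (the business name, address, and phone number below are made up for illustration), each location's landing page could carry its own LocalBusiness block, for example in schema.org microdata:

<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Hardware - Springfield</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  <span itemprop="telephone">(555) 555-0100</span>
</div>

Keeping one block like this per location page sidesteps the question of stacking multiple LocalBusiness items on a single URL, and gives each location a clean target for citations.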
| DarinPirkey0 -
Getting Your Website Listed
Hi, here's an excellent write-up regarding local SEO: http://moz.com/local-search-ranking-factors Best, Devanur Rafi
| Devanur-Rafi0 -
Does text, initially hidden within a tabbed structure, carry the same weight in Google?
Hey Mike, if you'd like to see how important rich content, and content above the fold, is, turn on the SERP overlay and do a Google search for "best time of year to go to Tahiti". Have a look at the backlinks data for the #1 result (that's me), and compare that to the ones below (TripAdvisor, Frommers, USA Today, etc.). Now, look at my page, and then look at the TripAdvisor page, paying attention to what non-template, non-navigation, non-clickable content is above the fold. And look at the size of the images. I trust you're convinced now, so let's move on to your next question. Google has spent a lot of time analyzing what users respond well to, and I'd say if their data shows that it's big images and more text, they're probably right. Keep in mind, you'll have a very low bounce rate if users do NOT find what they want on that page, but think they might by clicking the button next to one of the hotels. If they bail out after that, it still won't look like a bounce in the stats. You could also consider changing up the layout a bit so that the hotel search form is off to the right (maybe at the 1000-pixel mark or so), pull in the first sentence or so from the hotel description, and use the larger image of the hotel there. You also have a lot of vertical whitespace in there. While your style is very Web 2.0 and clean, an open style with a lot of whitespace unfortunately does push most of the content a fair bit down the page.
| MichaelC-150220 -
Product Category Subcategory hierarchy
I typically keep category names out of the product URL. It creates problems, especially when the product lives in multiple categories. You can use a rel canonical tag, but I'd rather just not have to deal with it. Here is how I do it:
www.domain.com/products/product-name/
www.domain.com/category/category-name/
and
www.domain.com/category/category-name/sub-category-name/
This convention has several advantages, including easy segmentation of the site to determine, for instance, how many product pages are indexed in the SERPs. One could argue that having the category name in the product URL is good for SEO because of the keywords, but I would argue that putting the product farther down in the site structure, and the non-canonical URL issues related to certain taxonomies (e.g. multiple categories for a product), negate what little benefit keywords in the URL have these days. Of course not all eCommerce platforms really allow the structure above. Magento, for example, will allow you to put products in the root, but not in the /product/ folder. The product will also be viewable on the category-version of the URLs, but they will have a rel canonical tag pointing to the root directory version. Good luck.
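To put the rel canonical mechanism mentioned above into markup (using the placeholder URL patterns from this answer), a category-path copy of a product page would carry, in its <head>, a tag pointing back at the standalone product URL:

<link rel="canonical" href="http://www.domain.com/products/product-name/" />

That way, any category-based paths the platform generates for the same product consolidate to the single canonical product URL.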
| Everett0 -
Will I mess with Authorship if I setup multiple client websites under my Webmaster tools login?
You should be just fine with G+ authorship as long as the clients' pages have the rel=author links in there. Yes, there are alternate ways of establishing G+ authorship, but it's my understanding that the explicit two-way links are the core means of identifying the author-document relationship, and that the alternate methods are just that. I think it's pretty safe to verify these using Google's own structured data test tool. Yes, it has a few bugs, and fails to read some websites, but it's overall pretty reliable, and I would expect that it's using a very similar bit of logic to determine authorship to the one the SERPs-producing code is using.
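For completeness, a minimal sketch of the explicit two-way links (the Google+ profile ID below is a placeholder): on the client's pages, link to the author's Google+ profile with rel=author, either as a byline link:

<a href="https://plus.google.com/00000000000000000000?rel=author">About the Author</a>

or as a link element in the <head>:

<link rel="author" href="https://plus.google.com/00000000000000000000"/>

Then, on the Google+ profile itself, the client's site gets added to the "Contributor to" section, which closes the loop.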
| MichaelC-150220 -
How to treat low-value and automated links during the link pruning process?
Hi there. I often find that it is easier to take value out of the equation when looking to remove/disavow links, and focus more on how the link got there. In your first example, the link may not be doing any good or harm and it might look out of place, but there's no doubt the link is a "natural" recommendation, in the sense that it hasn't been deliberately built by the company it links to. Such links wouldn't be a problem at all, in my opinion. For the second example, again, while this looks like an odd link, I believe it is clear to see that a webmaster has not gone out and built that link deliberately. That is what Google is ultimately looking for. Automatic crawlers, particularly statlog and others that will be well known to Google, won't need removing or disavowing - I'm pretty sure the algorithm (plus manual reviewers) just discounts them anyway.
If a link looks deliberately placed or produced by someone with a vested interest, automated or otherwise, that's where I think the alarm bells start ringing for quality checks etc. I believe it's why terms like "link earning" - getting people to link to you purely on the strength of the content and assets you produce - have proven so popular recently. Google will reward those kinds of links all day. Your first example is an example of that, albeit a small one. Links like the one in your second example will just be discounted, but not actively penalised.
It's interesting to consider that, once you see links as being "earned" or "natural", such as the client recommendation in the first example, you can see how some SEOs have manipulated the algorithm, and done so quite well: they replicate these "natural"-looking links. Where most slip up is that they leave a footprint or visible clues that they're doing it deliberately. I don't advocate you do that at all - earn those links by being an awesome company - but it's always good to know how things can work and/or be exploited, as you can learn to avoid pitfalls that way. Hope this helps!
| TomRayner0