Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Dynamic XML Sitemap Generator
Are you running a CMS? If so, maybe it has one built in or available as a plugin - for example, the Yoast SEO plugin on WordPress will create a sitemap, has plenty of inclusion & exclusion rules (including a per-page override), and helps you do a bunch of other stuff too. I'm sure most other CMSs have something available by now. If it's a custom-built dynamic site, maybe your development team (or developer) can add sitemap functionality. If you have a lot of pages backed by a custom database, the best way to make sure your sitemap is up to date is to generate it from your data, not to let some 3rd-party tool follow links around your site - Google can do that already! If you are not running a CMS or a custom app, how many pages do you have? It might be easiest to write the XML sitemap by hand (although I'm a developer, so that's easy for me to say), or try out Logan's recommendation.
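If you do go the generate-from-data route, the core idea is a minimal sketch like this (TypeScript, assuming a Node environment; the `Page` type and the sample rows are placeholders for whatever your database query actually returns):

```typescript
// Generate sitemap XML from your own records rather than from a crawl.
// The Page type and sample rows below stand in for a real database query.
interface Page {
  url: string;          // assumed already XML-safe; escape &, <, > otherwise
  lastModified: Date;
}

function buildSitemap(pages: Page[]): string {
  const entries = pages
    .map((p) =>
      [
        "  <url>",
        `    <loc>${p.url}</loc>`,
        `    <lastmod>${p.lastModified.toISOString().slice(0, 10)}</lastmod>`,
        "  </url>",
      ].join("\n")
    )
    .join("\n");

  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    "</urlset>",
  ].join("\n");
}

// In practice these rows come straight from the database:
console.log(
  buildSitemap([
    { url: "https://www.example.com/", lastModified: new Date() },
    { url: "https://www.example.com/about/", lastModified: new Date("2017-01-15") },
  ])
);
```

Because the sitemap is built from the same data that builds the pages, it can never drift out of date the way a crawl-based one can.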
| 4RS_John0 -
Problems with US site being prioritized in Google UK
Hi Chris, With ccTLDs, your geo-targeting is already set up for the UK. You do need to set up the .com for the US if that is what the content is targeted to, as Logan mentioned. As for why the .com is showing higher: is the content identical between the two sites? If not, how are they different? In the meantime, you can detect visitors' locations by IP and use a JavaScript prompt to ask anyone on the wrong site whether they'd like to switch to the other version. Give them the option; never redirect based on IP. You'll need a developer to do this.
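A minimal browser-side sketch of that "detect, then ask" approach - the geolocation endpoint here is hypothetical, so substitute whichever IP-lookup service you actually use:

```typescript
// Look up the visitor's country by IP, then *ask* before switching sites.
// The lookup endpoint is hypothetical - swap in your real IP-geolocation service.
async function offerRegionalSite(): Promise<void> {
  const response = await fetch("https://geo.example.com/lookup"); // hypothetical API
  const { countryCode } = (await response.json()) as { countryCode: string };

  const onUkSite = window.location.hostname.endsWith(".co.uk");
  if (countryCode === "US" && onUkSite) {
    // confirm() keeps the sketch short; a dismissible banner is friendlier in production.
    if (window.confirm("It looks like you're in the US. Switch to our US site?")) {
      window.location.href = "https://www.example.com" + window.location.pathname;
    }
  }
  // Crucially: if they decline, or the lookup fails, do nothing - never force it.
}

offerRegionalSite();
```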
| katemorris0 -
Purchasing an existing domain + redirecting to company's domain
A 301 is counted by Google as a link; both John Mueller and Matt Cutts have said that a few times. So it's just a link. But since it's from a competitor's domain, it is potentially a very powerful link, because Google's algorithm is believed to value links from websites within your topic/business/field much more than just any link. I have seen it work that way with my own eyes a couple of times. "Potentially", because of course it depends on many factors, like that competitor's metrics, such as DA/MT/TF/etc. If the competitor's domain is well established, I wouldn't worry too much about the redirect being flagged as manipulative by search engines. Whether the best choice is to 301 the root, 301 with some logic, or keep the website online as it is depends on many other factors, not all of them SEO-related.
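For the simplest case - redirecting everything on the acquired domain to your own, path for path - a minimal sketch, assuming the acquired domain is served by Apache with an .htaccess file (the target domain is a placeholder):

```apache
# Send every path on this (acquired) domain to the same path on yours:
Redirect 301 / https://www.yourdomain.com/
```

If "301 with some logic" is the better fit - say, mapping old category URLs onto different new ones - mod_rewrite rules give you that per-path control, at the cost of more maintenance.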
| max.favilli0 -
H1 tag found on page, but saying doesn't match keyword
I checked the source with my default user agent (in this case Firefox) and did NOT see an H1 tag. I checked with my user agent set to Googlebot and DID see an H1 tag, which did have that keyword phrase in it. I checked again with a default user agent, but this time with JavaScript disabled, and could not see anything at all on the viewable page (blank white page), though the source code was there without the H1 tag.

So it seems to me like you're pre-rendering the page for Googlebot, and are including the H1 (and other header tags) as part of a fully-rendered page for search engines. However, because that header tag does not exist if you turn JavaScript off - or if you're not Google - there may be a risk of Google seeing this page as "cloaking". Pre-rendering is good. It's not a "bad" type of cloaking if you serve the EXACT same page to search engines that you serve to everyone else. Unfortunately, that does not seem to be the case with the way this page is set up: Google sees one thing, other visitors (with or without JavaScript enabled) see something else.

I know developers are head-over-heels for single-page apps and JavaScript frameworks, but this stuff is starting to drive me nuts. It's like trying to optimize Flash sites all over again. On the one hand you have Google bragging about how great they are at crawling JavaScript, even going so far as to say pre-rendering is not necessary... and on the other hand there are clear, sustained organic search traffic drops whenever developers start turning flat HTML/CSS pages into these single-page JavaScript framework applications.

My advice is that if you're going to pre-render a page for Google: A) make sure the page a user with JavaScript enabled sees is exactly the same as what Google sees, and B) see if you can pre-render pages for visitors without JavaScript enabled as well.
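One way to reproduce this check from the command line is a pair of curl requests with different user-agent strings, then a diff (the URL is a placeholder; the Googlebot UA string is the one Google publishes):

```sh
# curl doesn't execute JavaScript, so this compares the *served* HTML by UA -
# exactly where server-side pre-rendering differences show up.
curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
  https://www.example.com/page > as-googlebot.html
curl -A "Mozilla/5.0 (Windows NT 10.0; rv:60.0) Gecko/20100101 Firefox/60.0" \
  https://www.example.com/page > as-firefox.html
diff as-googlebot.html as-firefox.html   # an H1 in one file but not the other is a red flag
```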
| Everett0 -
How do i migrate from Volusion to Magento with the same domain using 301 redirect?
Hi Kevin, You should use the Volusion to Magento migration tool from LitExtension to get the job done. This tool supports migrating full data, including URLs, from Volusion to Magento, so you won't lose your search rankings. I think LitExtension provides the best solution for keeping old URLs after changing to Magento; you can read their solution in this blog post, and there's a video as well: https://www.youtube.com/watch?v=BSFtOzYzD2Q
| alex.nguyen0 -
Recommendations for the length of H1 tags and how much it matters. What is the major disadvantage if the H1 tags are slightly longer?
Think of your page like the front of a newspaper. Your H1 is your big headline: short, sharp, and to the point. People skim pages when reading, but will read headlines. If the H1 catches their attention, they are more inclined to read on. As you continue skimming down the newspaper (your page), you may skip the 'normal text', but your eyes catch the next H2, which is like your newspaper sub-title. I went to a Matt Bailey seminar once, and he did a great piece on how pages catch the attention and how people read/skim pages. He has a book which also covers it.
| ninjahippo0 -
What do you add to your robots.txt on your ecommerce sites?
I'm on this same path, since we too cannot use noindex/nofollow due to limited backend interaction with Bigcommerce. I like to block all cart-related pages, which for ecommerce sites can be a boatload: /cart.php, /checkout.php, /finishorder.php, and /*login.php, just to name a few. Then you have the sorting and compare pages; they have to be blocked or a mess unfolds - for example /*sort=newest, /*sort=bestselling, and /*?page= (a big duplicate-page issue if you don't block that last one with a wildcard and cannot access your .htaccess file or the backend properly to noindex/nofollow). In my case, I only want the meat of the site to be indexed and ranking. Otherwise, one client's site was ranking for terms more related to web development than the niche industry it lived in. Plus, with a limited crawl budget, why would you want Google or anyone else to crawl pages on your site with no SEO value for your niche? Unless you sell carts - as in web-developed carts for ecommerce sites - you wouldn't want much of that indexed anyway, and even in that case those pages aren't too useful for ranking. At least from what I've gathered in the niche industries.
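Pulled together, the rules above would look something like this in robots.txt (paths are Bigcommerce-style examples; adjust them to match your own cart's URLs):

```
User-agent: *
# Cart and checkout pages
Disallow: /cart.php
Disallow: /checkout.php
Disallow: /finishorder.php
Disallow: /*login.php
# Sorting, comparison, and paginated duplicates
Disallow: /*sort=newest
Disallow: /*sort=bestselling
Disallow: /*?page=
```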
| Deacyde0 -
What are the best practices for geo-targeting by sub-folders?
I believe you can geotarget subfolders inside GSC, but you will need to set up each subfolder as a separate site inside GSC. Google has some tips here that might be of help. I would also consider using meta language tags.
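Assuming "meta language tags" here refers to hreflang annotations (my reading, not necessarily the poster's), a minimal sketch for two geo-targeted subfolders, with placeholder URLs, would be:

```html
<!-- In the <head> of every page in the set, each version listing all versions: -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```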
| rjonesx. 00 -
How to optimise a website for competitive keywords?
Thanks for your responses, guys. I got your point - I need to remove some keyword repetition to make the website look more natural, and need to update the site architecture. But I can't agree with the "doorway pages" statement. Every page of the website has unique content. In fact, one of our competitors OUTRANKS US with copy/paste text on every area page, like this: "Man and Van service copy/paste,copy/paste,copy/paste,copy/paste,copy/paste, area Barnet" "Man and Van service copy/paste,copy/paste,copy/paste,copy/paste,copy/paste, area Acton" "Man and Van service copy/paste,copy/paste,copy/paste,copy/paste,copy/paste, area Hammersmith" How is that possible? I thought content was king for Google, but in this case a website with duplicate content outranks a website with unique content.
| nasi_bg0 -
Moz page optimization score issue: have a score of 95, but can get to 99 if I add my keyword basically twice in the URL.
And we worked it out! In case anyone else has a similar question: When you pair a keyword to a page in the Page Grader tool—such as the page /laptop-bag/leather-shoulder/ paired with the keyword "leather shoulder laptop bag"—the tool will scan the page for exact instances of the phrase. So, even though the words "laptop," "bag," "leather," and "shoulder" are in the URL, the tool doesn't recognize the phrase "leather shoulder laptop bag," and so suggests adding it to the URL. If the URL is /laptop-bags/leather-shoulder-laptop-bags/, though, the tool _does_ see the exact term, so it bumps up the score and removes the suggestion. In this case, though, Deacyde is totally right—/laptop-bags/leather-shoulder-laptop-bags/ is, well, kind of terrible. The thing to keep in mind is that a score of 100 in the tool means that the keyword for which it's evaluating the page is in _absolutely every_ keyword optimization position. That doesn't always make sense, especially for a longer-tail term like "leather shoulder laptop bag." So in this instance, 95 is better than 99.
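To make the exact-match distinction concrete, here's a small illustrative sketch (not the tool's actual code) of the difference between matching the whole phrase as one slug and merely finding every word somewhere in the URL:

```typescript
// Illustrates exact-phrase matching vs. individual-word matching in a URL.
function slugify(phrase: string): string {
  return phrase.toLowerCase().trim().replace(/\s+/g, "-");
}

function analyzeUrl(url: string, keyword: string) {
  const exactMatch = url.includes(slugify(keyword)); // whole phrase as one slug
  const allWordsPresent = keyword
    .toLowerCase()
    .split(/\s+/)
    .every((word) => url.includes(word)); // each word found somewhere
  return { exactMatch, allWordsPresent };
}

// The two URLs from the example above, checked against the same keyword:
console.log(analyzeUrl("/laptop-bag/leather-shoulder/", "leather shoulder laptop bag"));
// => { exactMatch: false, allWordsPresent: true }  -> tool suggests adding the phrase
console.log(analyzeUrl("/laptop-bags/leather-shoulder-laptop-bags/", "leather shoulder laptop bag"));
// => { exactMatch: true, allWordsPresent: true }   -> tool bumps the score
```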
| MattRoney0 -
Sub-directories or Nah
Hi Meier! Very good feedback from the community, and the scenario you are describing is why most Local SEOs would recommend the approach of a single domain with a landing page on it for each of the 7 locations. This way, every single thing you do to build up the brand simultaneously benefits all the locations, instead of requiring 7 separate efforts every time you want to make a marketing decision.
| MiriamEllis0 -
Why have I lost my #1 ranking?
Hi Taylor, That screenshot is interesting. My assumption would be that Google has seen this duplication and is simply disregarding the other versions of what it considers to be the same content. The reason duplicate content isn't such a huge deal these days (it's not a good thing at all, but it won't really get you penalised) is because Google is much better at dealing with it. From my understanding, what you're seeing here is exactly the expected behaviour - Google determines which piece seems most relevant and useful to searcher intent based on a number of things, then disregards the others. No point in indexing multiple versions of the same thing, right? As for the smaller bits and pieces, I'll take a closer look at your site again once I'm back in the office this afternoon and let you know.
| ChrisAshton0 -
Would it work to place the H1 (or important page keywords) at the top of your page in HTML and move it lower on the page with CSS?
Hey, thanks Matt! I don't contribute for the points, but the perks are certainly nice.
| ChrisAshton0 -
Robots.txt gone wild
Hi Radi! Have Matt and/or Martijn answered your question? If so, please mark one or both of their responses "Good Answer." Otherwise, what's still tripping you up?
| MattRoney0 -
To subdomain or to subfolder, that is the question.
I agree with this approach, but I also would be trying to dig into _why_ they think they want a second site instead of just coming back with a recommendation. What do they think that will accomplish for them? Sometimes clients will ask you to do something that they think will solve a problem when really they should be asking you how to solve the problem, and the situation as he's described it totally feels like one of those times to me.
| BradsDeals0 -
Pagination with rel=“next” and rel=“prev”
Personally I'd use Screaming Frog for this. The URLs for prev/next elements can be found in the "Directives" tab and can be exported to Excel for easy comparison. Regards, Nico ... ah, too slow, and essentially the same answer
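For reference, the markup being audited is the pair of link elements in the head of each paginated page; on page 2 of a series it would look something like this (placeholder URLs):

```html
<link rel="prev" href="https://www.example.com/category?page=1" />
<link rel="next" href="https://www.example.com/category?page=3" />
```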
| netzkern_AG0 -
Pages with similar content: Redirect or Canonical? Or something else?
Either of these options is OK depending on the context, as Laura mentioned. As always, if there's no real reason for both pages to exist, then I'd suggest using a 301 to point the old page to the new one.
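For reference, the two mechanisms look like this (URLs are placeholders). The canonical option keeps both pages live but tells search engines which URL to consolidate signals to:

```html
<!-- In the <head> of the old/duplicate page: -->
<link rel="canonical" href="https://www.example.com/new-page/" />
```

The 301 option is a server rule instead; on Apache, for example, a single .htaccess line such as `Redirect 301 /old-page/ https://www.example.com/new-page/` sends both visitors and crawlers to the new URL.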
| ChrisAshton0 -
Google is alternating what link it likes to rank on WordPress site and
Hi Christy, it's not resolved yet; I'm still seeking help. My rankings are dropping daily because of this, and I don't know what to do.
| z8YX9F800 -
How can I track traffic source for each user?
Hi Sida, I was running into the same issue of not being able to track the traffic source for any individual lead submission using Google Analytics. We couldn't find any affordable solutions that offered exactly what we were looking for, so we decided to build a lead tracking tool ourselves. Now we can identify the traffic source of each lead as well as update the status of the leads throughout the sales lifecycle, which finally gives us a much better grasp of the ROI of our marketing campaigns. Our tool is currently completely free, but we will likely add premium plans in the future for users who want extra features. Feel free to check it out at http://convertable.com and let me know if you have any questions or suggestions for how to improve it. Or, if you'd rather simply use your existing form and track leads in Google Analytics, this blog post will show you how to do it with a little bit of coding: http://cutroni.com/blog/2009/03/18/updated-integrating-google-analytics-with-a-crm/
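If you go the "little bit of coding" route, the core idea is to stash the visitor's referrer and campaign parameters into hidden form fields so they travel with the lead. A minimal browser-side sketch - the form selector and field names are placeholders, not taken from the linked post:

```typescript
// Copy the visitor's referrer and UTM parameters into hidden fields on a
// lead form, so the traffic source is submitted alongside the lead itself.
function captureTrafficSource(form: HTMLFormElement): void {
  const params = new URLSearchParams(window.location.search);
  const fields: Record<string, string> = {
    lead_referrer: document.referrer || "(direct)",
    lead_utm_source: params.get("utm_source") ?? "",
    lead_utm_medium: params.get("utm_medium") ?? "",
    lead_utm_campaign: params.get("utm_campaign") ?? "",
  };
  for (const [name, value] of Object.entries(fields)) {
    const input = document.createElement("input");
    input.type = "hidden";
    input.name = name;
    input.value = value;
    form.appendChild(input);
  }
}

// Placeholder selector - point this at your actual lead form:
const leadForm = document.querySelector<HTMLFormElement>("#lead-form");
if (leadForm) captureTrafficSource(leadForm);
```

One caveat: this only captures the landing session's source, so for multi-visit journeys you'd want to persist the first-touch values in a cookie or localStorage.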
| StreamlineMetrics0