Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Behavior & Demographics

Learn more about search behavior and demographic trends.


  • Hi Darren, I can't say that I know of a tool that will do this for you, and there's a pretty good reason for that: there's no black-and-white way of knowing which keywords each page targets, and there hasn't been since search engines stopped paying attention to the meta keywords element many moons ago. There may be some tools out there that I've never come across, but the best they could offer would be an arbitrary estimation of what you're looking for; certainly nothing concrete. The closest I could offer would be SEMrush. It will give you keywords that your site ranks for, but from what I understand it's extrapolated data, meaning it isn't necessarily very accurate either. It's basically a ranking report at the domain level, i.e., which terms your website ranks for, rather than showing which page ranks for which term. Hope that makes sense!

    | ChrisAshton
    0

  • Although being hacked seems like the only reason I can think of at the moment, I am still scratching my head! It's only showing the HTTPS URLs in the Google search results, not in Bing or Yahoo etc. I can't see any of the HTTPS URLs when I perform a crawl of the site in programs such as Screaming Frog. I could understand if we were hacked that it would show spammy results with dodgy links, but the results aren't spammy; the titles and descriptions are from one of our customers' websites. Their site has not been hacked though.

    | Jvickery
    0

  • Well, for one thing, keeping and maintaining a keyword map that maps out all the URLs you want to target might help. Also, consider adding modifiers to keywords that you suspect might fall victim to cannibalization.

    | JordanLowry
    0
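The keyword-map idea in the answer above can be sketched in a few lines: record which keywords each URL targets, then flag any keyword claimed by more than one URL as a cannibalization risk. The URLs and keywords below are made-up placeholders, and real keyword maps usually live in a spreadsheet rather than code.

```python
# Minimal sketch of a keyword map: URL -> target keywords.
# Keywords claimed by 2+ URLs are likely cannibalization risks.
from collections import defaultdict

keyword_map = {
    "/blog/best-running-shoes": ["best running shoes", "running shoe reviews"],
    "/shop/running-shoes": ["buy running shoes", "best running shoes"],
}

def find_cannibalization(kmap):
    """Return {keyword: [urls]} for keywords targeted by 2+ URLs."""
    by_keyword = defaultdict(list)
    for url, keywords in kmap.items():
        for kw in keywords:
            by_keyword[kw].append(url)
    return {kw: urls for kw, urls in by_keyword.items() if len(urls) > 1}

print(find_cannibalization(keyword_map))
# {'best running shoes': ['/blog/best-running-shoes', '/shop/running-shoes']}
```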

  • Looking at your setup, I would have thought it should work, so I'm a little confused. If you're still having trouble, you can use filters in GA to extract the search term from the URL and pass it through to GA. Check out LunaMetrics' guide on how to do this: http://www.lunametrics.com/blog/2015/02/24/enable-site-search-reporting-google-analytics/ Regards, Sean

    | seanginnaw
    1
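What GA's site-search extraction boils down to is pulling the search term out of the page URL's query string. A rough illustration, assuming the site uses a query parameter named "q" (yours may differ, and the URL is a placeholder):

```python
# Extract a site-search term from a page URL's query string,
# mirroring what GA's site-search setting does with a configured
# query parameter. "q" is an assumed parameter name.
from urllib.parse import urlparse, parse_qs

def extract_search_term(url, param="q"):
    query = parse_qs(urlparse(url).query)
    terms = query.get(param)
    return terms[0] if terms else None

print(extract_search_term("https://example.com/search?q=blue+widgets"))
# blue widgets
```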

  • Sounds great! Let me know what impact you see from the changes, curious to hear!

    | Joe.Robison
    0

  • I have to agree that the first thing to consider (well, after fixing those broken links) is how your content ties into your conversion funnel. What compels me to stay on your site once I've found the answer I'm looking for? For example, if I'm just looking for recommendations for fans and I land on your post about them, once I've read, why wouldn't I click right back to Google? Out of curiosity, what constitutes a conversion for you? If a visitor doesn't bounce away, what would you prefer them to do?

    | MattRoney
    0

  • Wow! Thanks for the info! I have just been doing the Usertesting myself at varying locations until I find a good replacement or supplement.

    | RoxBrock
    1

  • OK... I know your site. In that wasted white space at the top, you should be offering everyone an opportunity to subscribe to breaking news and get links to breaking stories by email as soon as they are posted. Allow them to subscribe by team, or league, or trades, or salary, or coaches or goalies, or Stanley Cup. Back to your original question... Start experimenting and keeping records. You have the advantage of a strong site, a relevant site, and a site that is quick with the news. It could behave differently in different situations.

    | EGOL
    0

  • Hi Rebecca, Yes, the content on each city is unique (not great, but unique). When they were on subdomains they were absolute replicas of each other apart from city-name switches and maybe some short intro text; other than that, complete copies. Now those old subdomains get forwarded on to each new subfolder page, so city-location.maincompany.com gets forwarded to maincompany.com/uk/city-location/ Alex

    | SeoSheikh
    0
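The subdomain-to-subfolder forwarding Alex describes is, in effect, a simple URL transformation. In practice it would be implemented as 301 redirects in the web server config, but the mapping itself can be sketched as below (domain names are taken from the answer's placeholder examples):

```python
# Sketch of the redirect mapping described above:
# city-location.maincompany.com -> maincompany.com/uk/city-location/
# Real redirects would be 301s in the server config; this only
# illustrates the URL transformation.
from urllib.parse import urlparse

def subdomain_to_subfolder(url, main_domain="maincompany.com", prefix="uk"):
    host = urlparse(url).hostname or ""
    suffix = "." + main_domain
    if host.endswith(suffix):
        city = host[: -len(suffix)]
        return f"https://{main_domain}/{prefix}/{city}/"
    return url  # not a recognized subdomain; leave unchanged

print(subdomain_to_subfolder("http://city-location.maincompany.com/"))
# https://maincompany.com/uk/city-location/
```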

  • In terms of whatever help Google might give to a site with HTTPS vs. HTTP, no, this doesn't really matter. What type of SSL certificate works best? Companies offer a myriad of confusing SSL certificates, but the two primary types to pay attention to are: Standard Validation SSL, the standard level of validation, typically costing $0-$100; and Extended Validation (EV) SSL, which offers the highest level of validation and often costs $100-$500. From a rankings point of view, it makes absolutely no difference what type of certificate you use. For now. John Mueller of Google has stated that Google doesn't care what kind of SSL certificate your website uses, but that may change in the future. For the reasons outlined in the blog post by Cyrus that Gaston linked to, it could make a difference from a UX perspective, since EV certificates can look more trustworthy: https://d1avok0lzls2w.cloudfront.net/uploads/blog/540d50cd94bd58.88267717.jpg For an enterprise, the extra cost should be negligible. However, you should know that some EV certificates will only work on one subdomain, so you may need to purchase multiple certificates if you want to cover extra subdomains. "Wildcard" certificates that can work on multiple subdomains have only recently become available for EV certificates and are, I think, a bit more expensive.

    | KaneJamison
    0
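The subdomain caveat above comes down to how TLS hostname matching treats wildcards: a name like *.example.com covers exactly one label, so it matches shop.example.com but not a.shop.example.com or the bare example.com. A rough sketch of that rule (real clients use the ssl module's built-in verification; example.com is a placeholder):

```python
# Illustrates why one certificate may not cover every subdomain:
# a wildcard replaces exactly one leading DNS label under the
# usual TLS hostname-matching rules.
def wildcard_covers(cert_name, hostname):
    if cert_name.startswith("*."):
        parts = hostname.split(".")
        # The "*" stands in for a single leading label only.
        return len(parts) >= 2 and ".".join(parts[1:]) == cert_name[2:]
    return cert_name == hostname

print(wildcard_covers("*.example.com", "shop.example.com"))    # True
print(wildcard_covers("*.example.com", "a.shop.example.com"))  # False
print(wildcard_covers("*.example.com", "example.com"))         # False
```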

  • Hi Danny! How'd that crawl end up? If Andy answered your question, mind marking his response as a "Good Answer?" It'll get him some bonus MozPoints, and it helps us keep track of things in the forum.

    | MattRoney
    0

  • Hi there. Well, Analytics has only a couple of hours' delay, not days, so that's where I'd look. Also, even Google's ranking algorithms have some delay, so you won't really be able to see 100% how the changes affected the positions. Also look at the past year's behavior; if it's seasonal, you will see the repetitive traffic pattern in Google Analytics. Other than that, I don't think there is anything to go by immediately besides intuition and experience. Hope this helps.

    | DmitriiK
    0

  • I know. I just took your language topic as an occasion to say that RankBrain improves understanding. I didn't think you excluded other languages, really. Especially with such a typically German last name.

    | paints-n-design
    0

  • Thanks John, Always nice to get a second pair of eyes on my code. I made all the changes you described and ran another crawl test. Turns out you were exactly right! The "<divclass...>" tag missing a space was the main issue. That very tiny error stopped Moz from crawling any and all content within that main content div, so it couldn't see anything else except for the duplicated navigation, header, and footer elements. THANK YOU SO MUCH!!!

    | jeremyfleischer
    0
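The bug described above is easy to reproduce: with the space missing, "<divclass=..." parses as one unknown tag name rather than a div with a class attribute, so a crawler looking for the main content div never finds it. A quick demonstration with Python's standard HTML parser (the class name "main-content" is made up):

```python
# Show how a missing space in a tag breaks crawling: the parser
# never sees a "div" start tag in the broken markup.
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Records every start tag and its attributes."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append((tag, dict(attrs)))

def found_content_div(html):
    p = TagCollector()
    p.feed(html)
    return any(tag == "div" and attrs.get("class") == "main-content"
               for tag, attrs in p.tags)

print(found_content_div('<divclass="main-content">x</div>'))   # missing space
print(found_content_div('<div class="main-content">x</div>'))  # fixed
```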

  • It appears that Google is indexing your homepage but simply not keeping a cache. I checked, and there doesn't appear to be anything in particular that would prevent Google from caching your page. It is interesting to note that the larger pages on your site in terms of HTML, like your home page, are not cached. Maybe that is the issue. I am making a guess here, but I bet that if you get rid of those giant base64-encoded images and instead load them externally, you will get your homepage cached. Give it a shot.

    | rjonesx.
    0
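To gauge how much weight those inline images add before moving them to external files, you can scan the HTML for base64 data URIs and estimate their decoded size (base64 inflates payloads by roughly a third). A minimal sketch; the HTML snippet stands in for the real page source:

```python
# Find base64-embedded images in HTML and estimate the bytes each
# adds to the document. Regex is a rough sketch, not a full parser.
import re

DATA_URI = re.compile(r'src="data:image/[^;]+;base64,([^"]+)"')

def inline_image_sizes(html):
    """Approximate decoded byte size of each base64-embedded image."""
    return [len(b64) * 3 // 4 for b64 in DATA_URI.findall(html)]

html = '<img src="data:image/png;base64,' + "A" * 4000 + '">'
print(inline_image_sizes(html))  # [3000]
```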

  • I'm looking into landing pages now, trying to find a decent plugin for calls to action and landing pages. I looked at Inbound Now, but the bad reviews gave me concern.

    | cheaptubes
    0

  • I noticed MozPoint rank was pretty close to the aggregated rank for everybody in the Top 50 list, but when you look at all 4291 users there are some big differences. I think maybe in June I'll do just a top 1000 or have a minimum point requirement.

    | donford
    13