Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Thanks for sharing that link. It's interesting that Google is telling you to create a 301 for desktop-to-mobile redirects. I've always used this script and it's worked great; it has never negatively impacted my mobile or desktop SERPs:

    | BrianJGomez
    0
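The script itself isn't shown above, but a desktop-to-mobile 301 in .htaccess along the lines Google describes often looks something like the following. This is a hypothetical sketch, not the poster's actual script: the domain, the `m.` subdomain, and the user-agent pattern are all assumptions.

```apache
# Hypothetical sketch: 301-redirect mobile user agents from the desktop
# site to an m. subdomain. Domain names and the UA pattern are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteCond %{HTTP_USER_AGENT} (android|iphone|ipod|mobile) [NC]
RewriteRule ^(.*)$ http://m.example.com/$1 [R=301,L]
```

A real deployment would also want matching redirects (or `rel="alternate"`/`rel="canonical"` annotations) in the other direction so desktop crawlers aren't trapped on the mobile site.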

  • The only caveat is this: SEMrush scrapes the top 20 results. It does not, however, scrape a 10-result page 1 and a 10-result page 2. It scrapes a 20-result page 1. If you look at 10-result SERPs and 20-result SERPs, you'll see that they're slightly different. It's much more likely that you'll see multiple results from the same domain on a 20-result SERP. So be aware that if SEMrush tells you that you own the #2, #3, and #4 spots for a given keyword, that's probably not true on the standard 10-result SERP.

    | CMC-SD
    0

  • Unless you want to go to the trouble of making unique content for each page, I would suggest combining them into a single page. Although, if it were me, I'd make the web's best page about pink sweatbands, so good and so over the top that it attracted its own links - think of this https://www.google.co.uk/search?q=opink%20swetabands&rlz=1C1CHFX_enGB502GB502&sugexp=chrome,mod%3D14&um=1&ie=UTF-8&hl=en&tbm=isch&source=og&sa=N&tab=wi&ei=iUZzUNTjAu3C0AX48IEo&biw=1280&bih=923&sei=i0ZzUJH7H6PB0QWFpYGoDQ as a page on your website. It would be amazing!

    | firstconversion
    0

  • thanks Keri

    | casper434
    0

  • Hi Tom, Great question. There has been a ton of discussion about this of late. Rather than try to rewrite all of it, I'd like to link to a couple of pieces that give a good assessment, particularly of Yext and their NAP correction service.

    Here is one from Nyagoslav Zhekov that actually received a personal response from Yext's Howard Lerman: http://www.ngsmarketing.com/why-yext-might-not-be-the-best-fit-for-your-business/

    Here is Mike Blumenthal's take on Yext: http://blumenthals.com/blog/2012/03/01/yext-local-seo/

    And here is David Mihm's: http://www.davidmihm.com/blog/seo-industry/yext-local-marketing/

    I recommend you pay very careful attention to this part of Nyagoslav's article:

    "NAP consistency – the main advantage of the service is that it rectifies the business information across Yext's network, thus helping up the local search rankings (I have previously discussed the value of citations for local SEO). I have been 'promoting' the Local Search Scorecard, an instant scanning service Yext offers for free, as a useful local SEO tool, too. But if we look deeper, we could see that there are fundamental problems with the Power Listings service and its helpfulness in terms of keeping one's business NAP consistent. How the service works is: It scans all the network's websites for the specified business information (name, address, phone number). It determines the best matching listings (if any) on each website and highlights them. It then syncs the data input via the dashboard on all these websites, updating the listings that previously have been determined as best matching, and 'filling in' with new listings where it was unable to find matching ones.

    The three main problems here – first, it is possible that Yext's scanning will not find the correct listing (happens often when the same business has two offices in the same city, or when a business has been using the same phone number for more than one of their locations); second, it is possible that Yext's scanning will not find any listing, although there is one (happens if NAP is very inconsistent); third, it is possible that there are duplicates, and as Yext finds only one listing per website, these won't be taken care of (happens almost in every single case). What this all means is that the chances for Yext to not clear up your NAP completely, or to actually screw your other location's NAP, are pretty high. I haven't made large-scale research on this matter, but according to my observations there are at least a few wrong or correct duplicates for a business in at least 80% of the cases."

    Please check out those resources regarding Yext. Regarding Localeze, I have been part of discussions about similar problems, but do not have a published source to quote on this. Others might have a different opinion, but basically, every Local SEO I know agrees that there is no replacement on the market right now for manual citation correction. Hope this helps!

    | MiriamEllis
    0

  • Joseph I'd make each page totally unique to the city. Don't worry about/focus on getting a penalty or not. Make the pages so useful for the user that if you have a sentence or two that's similar you won't get a penalty. Useful to me would be information that's entirely specific to what the page is "about". A page with some testimonials and an in-depth case study, photos and useful info in regard to that location should really deliver and give people what they need to know about the services in that location! -Dan

    | evolvingSEO
    0

  • What was causing the session ID to form a URL (page) structure like that? It should be appended to the page URL dynamically, e.g. "?(S(whukyd45tf5atk55dmcqae45))". Has that been fixed? Here's something that may help: wherever possible, avoid the use of session IDs in URLs; consider using cookies instead. Whenever possible, shorten URLs by trimming unnecessary parameters. http://support.google.com/webmasters/bin/answer.py?hl=en&answer=76329

    | BrianCrouch
    0
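For anyone hitting the same issue: the `(S(...))` token in the question looks like ASP.NET's cookieless-session format, and the usual fix is to switch sessions to cookies as the answer suggests. Once that's done, any old session-ID URLs that got crawled can be 301'd to their clean equivalents. A hypothetical .htaccess-style sketch, assuming an Apache front end (the token pattern is based on the example above, not a rule from the thread):

```apache
# Hypothetical sketch: after moving sessions into cookies, 301 any URL that
# still carries a (S(...)) session token to its clean equivalent.
RewriteEngine On
RewriteCond %{QUERY_STRING} \(S\([a-z0-9]+\)\) [NC]
# The trailing "?" in the substitution drops the old query string.
RewriteRule ^(.*)$ /$1? [R=301,L]
```

On an IIS-only stack the equivalent cleanup would live in web.config rather than .htaccess.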

  • Hi Marcus, OK, I can tell you what I know, as I only got involved in this in the summer, some 12 months after the SEO contract finished with the old web provider; nothing actually happened thereafter. I don't think we are working off a penalty. However, we did have a lot of problems when the new site was launched last November, with duplicate data, 1,000s of 301 redirects, 2,500 pages 404'ing, etc. This has only resolved itself in the last 4 to 6 weeks, after a lot of heartache and dropped revenue. 404s are at 840-ish and dropping daily now; duplicate content is down to 28 URLs and dropping too. All because of bad advice, hence why we are now looking after things in house.

    Competition are your Amazons, Marks & Spencer, towels.co.uk, Christie Towels, Linens Limited. I think we can better some of these for sure. In terms of how link building was done previously, I have no idea. There was no blog, no social media (still none, but looking at this), so I guess normal link building routes. Their link profiles are massive in comparison to ours, and I wonder how many links we have lost because of all these bad 404 pages, etc. So I would like to capture and redirect anything that got value before it falls away for good. I have tried signing up to Link Detective, but they are not sending an authorisation email, so I can't gain access, frustratingly. Any more info I can provide, let me know. Thanks, Craig

    | Towelsrus
    0

  • Also, Google will sometimes replace a homepage title that is full of keywords with the brand name. This is usually fixed when you put your brand name back into the homepage title.

    | MarkHodson
    0

  • Is there documentation somewhere that says that alerts work differently when you also use Webmaster Tools?

    | IanTheScot
    0

  • Great code, Josh... but after I saved it in .htaccess, a "?" appeared in the link: http://www.domain.com/?/example/file.html Is this OK? Please advise. Thank you,

    | willyg
    0
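Josh's original rule isn't shown in the thread, but a visible "?" like that often means a front-controller rewrite is exposing its query string in the browser instead of rewriting internally. A hypothetical sketch of the usual pattern, where the "?" stays internal because the rule has no `R` (redirect) flag; the `index.php` target is an assumption:

```apache
# Hypothetical sketch: route non-file, non-directory requests to a front
# controller. Without an R flag this rewrite is internal, so the visitor
# never sees the "?" in the address bar.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?/$1 [L]
```

If the "?" shows up in the address bar, check whether the rule accidentally uses `[R]`/`[R=301]`, or whether the substitution URL itself begins with `/?`.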