Dear Moz,
Another error: the following URL loads an empty page: https://moz.com/local/categories
Please review
Thanks!
Dear Moz,
I've received your email about Moz Local. It's a fantastic tool, but it doesn't let you download a template. Clicking 'Download this template' simply reloads the page.
I'm testing it in Chrome's incognito mode with no add-ons.
Thank you!
I'm in the planning stage of a new ecommerce page. To reduce duplication issues, my page will be static, with 20% of the page composed of dynamic fields.
So when a user selects a size or color, the dynamic fields are the only ones that change, as the rest of the content stays the same. I can keep a static URL and not worry about duplication issues. Focus can then go on strengthening this single URL with rich schema, reviews, and backlinks.
We're going to cache a default page so that, for crawlers, the dynamic fields don't appear empty. My developer said they can cache the page with all the variants of the dynamic fields, and use hidden DIVs to hide them from the user.
This way, the load speed stays high, and search engines might crawl those keywords too. I'm thinking about it and going, "wait a minute, that's a good idea... but would a search engine think I am hiding content and give me a penalty?". The hidden content is relevant to the page, and it only appears according to the drop-down, to make the user experience more "friendly".
What do you think? Use hidden DIVs, or use JavaScript so bots can't crawl the hidden data at all?
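For what it's worth, the hidden-DIV approach described above can be sketched in a few lines. This is only an illustration, not a recommendation on the penalty question; the names (`pickVisible`, the variant IDs) are made up for the example. The idea is that every variant block ships in the cached HTML, and a change handler just toggles which one is shown.

```typescript
// Sketch: all size/colour variant blocks are present in the cached page,
// and selection only toggles visibility. Names here are illustrative.

type VariantId = string;

// Given every variant shipped in the cached page and the user's current
// selection, return which block should be visible and which stay hidden.
function pickVisible(allVariants: VariantId[], selected: VariantId): Map<VariantId, boolean> {
  const visibility = new Map<VariantId, boolean>();
  for (const id of allVariants) {
    visibility.set(id, id === selected);
  }
  return visibility;
}

// In the browser you would wire this to the <select>, roughly:
//   sizeSelect.addEventListener("change", () => {
//     for (const [id, show] of pickVisible(ids, sizeSelect.value)) {
//       document.getElementById(id)!.hidden = !show;
//     }
//   });
```

Because the variants are in the HTML from the start, no extra request is needed on selection, which is where the load-speed benefit comes from.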
I wanted to know everyone's thoughts on reducing duplication by creating a static page with a few dynamic fields. Has anyone done this?
If 75% of a page is static and 25% is dynamic, is that a good ratio? It's an idea I'm thinking about to combat duplication issues affecting ecommerce pages. Some ecommerce sites generate a new page for a small change like size, even though the content is otherwise the same. What if you could create a single static page where, depending on the size chosen, only the fields connected to size are dynamic? Everything else remains the same.
For caching purposes, you would always serve a cached page with default values.
Wouldn't this work? Isn't this a solution for duplicate ecommerce pages? It would also help with ranking: rather than external links being spread across multiple duplicate ecommerce pages, they would all point to a single page.
My approach is thumbs up with a smile. Universal, and it makes everyone feel epic! Including myself.
A thought has crossed my mind. I've started to think about my relationships (uh oh... where is this going?) with my web developer and designer, and it seems design has become the core part of every SEO strategy I employ.
With every page layout, every piece of rich content, and every behaviour my customer might take, I'm thinking about the design. Are we, in a way, designers as well? We wear so many hats; is design one of them? I'm technical with my web developer, but I'm also concerned with the delivery the customer sees.
How is your relationship with your web designer and developer? Are they core individuals in your SEO life?
For a personal project, I added a CDN to my site (MaxCDN). The CDN now delivers every image via a subdomain, and it has improved my site's load speed.
My goal from the start was speed; in fact, I got obsessed with load speed, as I wanted to score over 90 in Google's PageSpeed and GTmetrix. But there was another element at play, and it was the most crucial one: the customer. I'm impatient when it comes to web browsing, and I know I'm not alone. We know what we want when we click links, and we expect it to load fast. I am my own customer, so speed was important.
I name my image files with SEO in mind, using dashes, key phrases relevant to the image, and alt tags, but I also know images aren't the sole driver of backlinks and leads. Your site is an ecommerce site; with up to 21 images per page, speed is what matters most here.
To conclude, I would put yourself in your customer's shoes and ask: what do I want when I visit this ecommerce site? Will a slow site frustrate you or make you leave? Will speed change your experience with the site and make you browse more? How are you showing such large images? Is the user experience fluid?
I don't think there ever is an individual who knows the right answers to everything when it comes to SEO. We're all exploring ideas, learning and sharing knowledge of our own findings and research.
Let's step outside the SEO world, throw away our knowledge, and look at the website. Would you say it is a website your client should be on? If your client is indeed an internet company made in NYC, shouldn't they be mentioned on NY Tech Meetup? From this perspective, I would say yes.
My gut feeling is that Google won't penalise a website that is an internet company made in NYC and listed on a non-profit organisation's website with a nofollow link. It seems like a natural fit.
Second, looking at Open Site Explorer, the page has a page authority of 78/100 and 174 linking root domains, including some big powerhouses like The Guardian, Bloomberg and Forbes (I didn't see any nofollows). I definitely think these are helping. I remember working on a client's webpage once: we optimised the page with rich content and clear calls to action, and it was ranking on page 2. It got two hyperlinks from the BBC and another high-authority website, and two weeks later, "boom", we were ranking on page 1, position 4.
Now let's explore the "black hat" technique. The core one would be requesting reciprocal links with the anchor text "Made in NYC" hyperlinked.
You are right: I would call it "black hat" if I saw someone else do it, but in this scenario, I would go "that's fine". It all depends on the situation.
You see, if I were to move this whole concept to a real-world scenario, where NYTM was a shop and they had a book listing other shops built in NYC, would you penalise them? "You, sir, should not list such shops, nor should other shops say you have a list of NYC-built shops!"
In all honesty, I don't see what they are doing as a big no-no. I think things should be looked at case by case, rather than lumping everyone into a single group.
You asked why you can't create a .org page similar to this... I say, why not? Note that their directory page isn't the core of what they are. It's just a single page of their entire entity, and I think that plays a big part in the strength of their web presence.
I realize you are frustrated, and all of us have our own thoughts. My thinking has always been to compare things to a real-life scenario, and to focus more on creating great content that others will link to, rather than chasing links myself. Sure, they might all use nofollow, but they clicked through wanting to see my page, and I'll let my rich content, site design and clear calls to action turn them into returning visitors.
Don't fret, my friend. In a weird way, this is the perfect board to vent and hear everyone's thoughts and ideas. I hope my thoughts haven't put you off.
Maybe it all has to do with the site itself and the "human" approach Google is taking.
What I'm trying to say is that maybe Google is evolving to be more complex and human-like, applying something like moral judgement to its decisions.
On the flip side, maybe it's just software that failed to notice the site amongst the millions of websites out there, and in time it will catch them.
I noticed in the latest Yoast newsletter that they are releasing themes in February. I'm a big fan of the Yoast SEO plugin, so their themes might be something to look into as well. No doubt they will be SEO-focused.
Here's an online link to the newsletter http://us1.campaign-archive2.com/?u=ffa93edfe21752c921f860358&id=958ca84faa&e=6d12308613
Hope that helps
Thanks Alex,
I do have canonical tags on the webpages to ensure they are seen as the main ones. I'll look into tracking subdomains.
If a website provides PDF versions of the page as a download option, should the PDF be no-indexed in your opinion?
We have to offer PDF versions of the webpage because our customers want them; they are a group who will download and print the PDFs. I thought of leaving the PDFs alone, as they sit in a subdomain, but the more I think about it, the more I think I should noindex them. My reasons:
On the flipside
What are your experiences?
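One practical note: a PDF can't carry a `<meta name="robots">` tag, so the usual way to noindex one is the `X-Robots-Tag` HTTP response header, set server-side. Here's a minimal sketch, assuming you control the server that serves the downloads; `headersForDownload` is a hypothetical helper, not a real library function.

```typescript
// Sketch: attach X-Robots-Tag: noindex to PDF responses so crawlers skip
// the file while the HTML page it came from stays indexable.
// headersForDownload is an illustrative helper name.

function headersForDownload(path: string): Record<string, string> {
  const isPdf = path.endsWith(".pdf");
  const headers: Record<string, string> = {
    "Content-Type": isPdf ? "application/pdf" : "application/octet-stream",
  };
  if (isPdf) {
    // Equivalent of a robots meta tag, but delivered as an HTTP header.
    headers["X-Robots-Tag"] = "noindex";
  }
  return headers;
}
```

The same header can be set once at the web-server level for all `*.pdf` requests, which avoids touching application code at all.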
I agree with Michael, it's about identifying the problem and dealing with it.
A valuable tool I use is Google's text cache to see what spiders see: http://webcache.googleusercontent.com/search?q=cache:http://www.njhypnotherapy.com/&strip=1
Are spiders seeing your key term? You've got testimonials as images; they're great content that users will find useful, but as plain text, search engines could read them too.
Also think about the homepage: what should it really rank for? We tend to think "I need my homepage ranking for core key phrases", but should it? I believe the homepage should rank for a generic search phrase and appear for brand searches, while other pages on your site focus on the specific key terms you want to target. In your case, the homepage should be focused on "hypnotherapy", and pages built for 'hypnosis for anxiety' and 'hypnotherapy fear' should rank for those.
I would even make small changes, like changing the title to read **Hypnosis and Hypnotherapy in New Jersey, NJ | Imraaz Rally C.ht, NLP** and rewording the meta description to bring hypnosis and hypnotherapy more to the front. Users land on your page searching for hypnosis and hypnotherapy in NJ; your name/brand seals the deal, in my opinion, not the other way round.
Interestingly, I can't see your site appearing in local search when I searched for "hypnotherapy, NJ". This would be ideal for you and could drive local, relevant traffic.
I can see you are using authorship on your blogs, but your image isn't appearing in search results. Have you set it up correctly? In your Google+ profile, have you linked the site and set yourself as a contributor to it?
I would also look at things like introducing schema.org markup to microtag your site for search engines.
Rework the structure and your good content will build external links. You can even go beyond that and do things like Trustpilot reviews, and run location-specific PPC ads that show the rating stars from Trustpilot.
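To make the schema.org suggestion concrete, it could look something like the JSON-LD below, which goes in a `<script type="application/ld+json">` tag in the page head. Every detail here (the business name, URL, region) is a placeholder, not the real practice's data.

```typescript
// Sketch of schema.org structured data for a local practice, as JSON-LD.
// All values below are placeholders to be replaced with real details.
const localBusinessJsonLd = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Example Hypnotherapy Practice",
  url: "https://www.example.com/",
  address: {
    "@type": "PostalAddress",
    addressRegion: "NJ",
    addressCountry: "US",
  },
};

// Serialised, this is the snippet that would be embedded in the page:
const jsonLdScript =
  `<script type="application/ld+json">${JSON.stringify(localBusinessJsonLd)}</script>`;
```

JSON-LD is generally easier to maintain than inline microdata attributes, since it lives in one block rather than being woven through the page markup.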
Thanks, I'll look at manually specifying these parameters and see if they make an impact.
Thank you streamline,
That's interesting. I have provided 'searchType', 'searchTerm', 'search', 'cat', 'filter2name' and 'filter1name' as URL parameters.
We have 49 parameters listed, set to 'Let Googlebot decide'. I thought adding the parameters here would stop Google from indexing those URLs? I believe our setup already does this?
What do you mean by "multiple ways"? We have a search page, which isn't indexed, and internal links from pages, but that wouldn't count, would it? It's not as though the URL string changes from a search page or an internal hyperlink.
There are no blog posts; it's an ecommerce site, and every product page and article page has the URL www.domain.com/.
I even looked at my GA, and it reports 14,000 pages.
If there were a tool to export all the search results, I could have manually looked into why the count is so big.
I have 14,000 pages on my website, but when I do a site:domain.com search on google, it shows around 55,000.
I first thought, "hmm, maybe it is including subdomains". So I tried site:www.domain.com and now it shows 35,000. That is still more than double the pages I have.
Any ideas why? When you filter a Google search using "site:", isn't it meant to pick up just that site's pages?
P.S. I tried using the SEOquake add-on to download the search results as a CSV file to review, but the add-on only downloads the first 100 results.
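One common reason the site: count beats your real page count is that the same page gets indexed under several URL variants: query-parameter versions, http vs https, trailing slashes. A quick sketch of that effect, where `canonicalKey` is an illustrative helper (not a Google tool) that collapses variants to one key:

```typescript
// Sketch: collapse URL variants (protocol, query string, trailing slash)
// to one key, to show how several indexed URLs can be one real page.
// canonicalKey is an illustrative name, not part of any real tool.

function canonicalKey(rawUrl: string): string {
  const u = new URL(rawUrl);
  // Ignore protocol and query string; normalise the trailing slash.
  const path = u.pathname.replace(/\/$/, "") || "/";
  return `${u.hostname}${path}`;
}

// Three hypothetical indexed URLs for the same product page:
const indexed = [
  "http://www.domain.com/product?cat=shoes",
  "https://www.domain.com/product/",
  "https://www.domain.com/product?filter1name=red",
];
const uniquePages = new Set(indexed.map(canonicalKey));
// uniquePages.size is 1: three indexed URLs, one real page.
```

If parameterised URLs like these turn up in the site: results, canonical tags (or the URL Parameters settings you mentioned) are the usual places to look.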
Will your old site be replaced with the new website? If so, I would place the message at the top, so it appears above the header; the message pushes the site down.
The message could have a 'continue' button; once clicked, it disappears, pushing the site back up. This way, everyone will see it, no matter which landing page they arrive on.
We do something similar for our cookie message
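The dismiss-on-continue behaviour can be sketched like so. This is a minimal illustration, assuming the message is a full-width bar rendered above the header on every template and that the visitor's choice is remembered in `localStorage`; the function name and storage key are made up for the example.

```typescript
// Sketch: decide on page load whether to show the notice bar, based on
// what was stored when the visitor last clicked "continue".
// shouldShowNotice and the "noticeDismissed" key are illustrative names.

function shouldShowNotice(stored: string | null): boolean {
  return stored !== "1";
}

// Browser wiring (sketch):
//   if (shouldShowNotice(localStorage.getItem("noticeDismissed"))) { bar.hidden = false; }
//   continueButton.addEventListener("click", () => {
//     localStorage.setItem("noticeDismissed", "1");
//     bar.hidden = true;  // the page content moves back up
//   });
```

Because the decision happens on every page load, the bar appears whichever landing page the visitor hits first, and stays gone once dismissed.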