Best posts made by becole
-
RE: Is it possible to do guest blogging on moz blog?
Stephanie, Logan is correct--we used to be able to submit blog posts on YouMoz, and if a post was good enough it could be promoted so it would show up on the main blog. That has been paused for now, and I'm looking forward to being able to contribute again.
-
RE: What is the fastest way to deindex content from Google?
Rosemary, to remove content quickly, you'll want to combine several approaches. Google's processes for crawling and removing content from the index don't all happen at once, so it's best to do several things:
-
Remove the content. When visitors or bots visit the URL, return a "410 Gone" server header code rather than just a 404 error.
-
If the content must stay live on the site but still needs to come out of Google's index, consider password-protecting it, putting it behind a paywall or a login, and/or adding a meta robots noindex tag to the page.
-
Add a robots.txt file on the subdomain so that it tells the bots to stop crawling. If you use something like dev.yourdomain.com for a dev section of the site, make sure that you have a robots.txt file at dev.yourdomain.com/robots.txt.
-
Use Google Search Console to remove the content. Once logged in, use the removal tool: https://www.google.com/webmasters/tools/removals?pli=1
Using several of these approaches together is the fastest way to remove the content.
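As a sketch of the first and third steps, assuming an Apache server (other servers use different directives; /old-page.html is a hypothetical path), the removed URL can be served with a 410 from .htaccess:

```apache
# .htaccess on the main site: serve "410 Gone" instead of a 404 for the removed URL
Redirect gone /old-page.html
```

And the robots.txt at dev.yourdomain.com/robots.txt would tell all bots to stop crawling the dev subdomain:

```
User-agent: *
Disallow: /
```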
-
-
RE: Directory Listings no longer counted in Backlinks?
Excal, I've always considered the links shown in Google Search Console to be a sample of all of the links to your website. For example, there are links that I know are pointing to our website--good links--but they come and go in Google Search Console. It doesn't show all of the links. So, just as you have seen them go away, there's a good chance that they will come back and be listed again.
I do know that the more often you download the links, the more often Google refreshes the list.
-
RE: Writing unique meta titles for canonicalised products or not?
No, you do not need to write unique meta titles for the others. However, you may want to swap out the name of the color so that the title tag reflects the page that the user is on. This is really more for user experience than for the search engines or SEO.
For example, if someone likes one particular color, they may choose to bookmark that page in their web browser. If they bookmark it, then the title tag would appear in the title of the bookmark--so having the correct product information there (the color) would be helpful.
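For instance, the color swap could be automated in a product template. A minimal sketch (the product, color, and brand names are made-up examples):

```python
# Build a user-friendly title tag for each color variant of a product.
# Product, colors, and brand below are hypothetical examples.
def variant_title(product: str, color: str, brand: str) -> str:
    return f"{product} - {color} | {brand}"

for color in ["Red", "Blue", "Green"]:
    print(variant_title("Classic Canvas Sneaker", color, "Example Shoes"))
```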
-
RE: Google Indexing Pages with Made Up URL
Brian, when this happens, there is typically one reason: somewhere there is a link with that URL in it. What we've seen before is that oftentimes those links are created by hackers or spammers that then try to create content on your site with that URL. For example, when a site is hacked, they will create a page on your site and then link to it.
Without the URL (or the page name without your domain name), it's tough for me to see what might be causing this. But, there has to be a link somewhere to it in order for Google to want to index it.
What I would do is use a server header check tool (such as http://www.rexswain.com/httpview.html) to see if the page has a "200 OK" server response or a 404 error. Google typically doesn't index pages that deliver 404 errors. It could be that the server is set up to deliver a "page not found" on your site but it comes up with a "200 OK" in the server header, so Google indexes the page.
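The same header check can be scripted with only Python's standard library; in this sketch a throwaway local server stands in for your site, so you'd point `status_of` at your real URLs instead:

```python
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    # Answer HEAD requests: 200 for a known path, 404 for anything else.
    def do_HEAD(self):
        self.send_response(200 if self.path == "/ok" else 404)
        self.end_headers()

    def log_message(self, *args):  # keep the test server quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status_of(url):
    """Return the HTTP status code the server sends for a HEAD request."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        return urllib.request.urlopen(req).status
    except urllib.error.HTTPError as e:
        return e.code

ok = status_of(f"http://127.0.0.1:{port}/ok")            # -> 200
missing = status_of(f"http://127.0.0.1:{port}/made-up")  # -> 404
server.shutdown()
print(ok, missing)
```

If a "page not found" page comes back as 200 OK here, that's the soft-404 setup described above.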
Check your site to see if there is a link to the page. If the link exists, then fix it. Then, look at Majestic.com or Open Site Explorer to see if they show any links from other sites to the page. If those links exist, see if you can get rid of those links.
-
RE: Google is putting brandname: in title tag
Donnleath, I wouldn't worry too much about this. In fact, Google has been rewriting title tags for about three or four years now.
Google, for whatever reason, will rewrite your title tag when it decides its version better matches the actual search query being used. It's quite possible that your home page, for example, will have its title rewritten for one search query but not for another.
Google will use, from time to time, your brand name, company name, and even elements of your site's navigation to rewrite your title tag. I've seen them take internal links from a site's navigation and breadcrumb trail and use those words in the title tag.
In your case, what you need to decide is if the title tags that Google is rewriting are better or 'worse' for your searchers. If they're worse, then you might consider looking at your site's navigation and breadcrumb trail to see if there's something that you can fix on your site to maybe influence Google to rewrite them another way.
If you see that Google is rewriting ALL of those title tags, though, on all pages, then you might want to take a look at your site's title tags and see if they do need to be rewritten, taking what Google is suggesting into account.
-
RE: Hotel SEO / Rank Conundrum
Meisha, this can definitely be frustrating. When it comes to local listings and individual units, keep in mind that every unit should have its own unique unit number, so it would have its own address.
You mentioned this: "He has also taken ownership of the building Google Plus page, Facebook page, etc. He only owns a handful of units in the building. "
If that other person has essentially taken ownership of the entire building--the Google Plus page, Facebook page, etc.--then it sounds as if he is misrepresenting his ownership. Pressure can therefore be put on him to disclose that he owns only certain units in the building, and you should be able to force him (legally) to represent only the units that he actually owns.
If this is the case, then he would need to update his Google Local listing(s) so that they only show the actual address of the unit(s) that he owns. If it doesn't currently, and it shows that he owns the entire building, then he should be forced to update it.
You should consult a lawyer, but most likely a stern letter to him asking him to update the website, Google Listings, and any Facebook (and other URLs) so that they only show the unit numbers he actually owns would probably go a long way. In the meantime, any listings that you create should also reflect the actual units that you own, as well.
When it comes to Google's local listings, it's perfectly fine to have multiple "businesses" at one location, as long as they have unique suite numbers. In this case, there are individual unit numbers, so there is an option to create a listing for each unit. It's not okay for this other person to misrepresent his ownership.
-
RE: Mass uploading low quality product pages
Becky, if you are aware that you have a lot of content that's going to be duplicate, then you've already identified the first step--which is to recognize that they are duplicates. Too many people just upload those pages and don't realize that they're duplicates and then wonder, after the fact, why their site's traffic went down. So for that, I commend you.
In order to deal with this, though, you need to determine which pages are truly going to be duplicates of other pages. Once you've done that, then you should use the canonical tag. The canonical tag should be placed on the duplicate pages and point back to the main page (or the one you don't want to be marked as duplicate).
Come up with a strategic, realistic plan for making those pages unique by adding or rewriting the content. And you might want to look at information such as your site's analytics or make a list of your best-selling products and deal with those first.
Adding a noindex tag to pages and removing them from the index really shouldn't be an option, because you DO want those pages indexed--adding content to them will make them unique and you'll be able to remove the canonical tag. Once you mark a page and tell the search engines not to index that page, it's much tougher to get it BACK in the index, so I wouldn't do that.
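The canonical tag itself is just one line in each duplicate page's head, pointing back to the main page (the URL here is a placeholder):

```html
<!-- On each duplicate product page, inside <head>: -->
<link rel="canonical" href="https://www.example.com/products/main-product/">
```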
-
RE: Sitemap Size effect SEO
Spacecollective, your website's sitemap file(s) don't have any direct impact on search engine rankings. If you have a large website with over 4,500 URLs, then most likely you're using a content management system (CMS) that should be able to create a sitemap file.
If your website's CMS can't do that, then I would recommend crawling the website yourself and updating the sitemap file. However, keep in mind that if you do it manually then you'll need to update it whenever you add or remove a page on the website.
Typically, if your navigational structure is set up in a way that all pages on the website can be crawled via links on the site, you generally shouldn't have anything to worry about.
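If you do end up maintaining the file by hand, the XML sitemap format itself is simple; the URLs and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```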
-
RE: Two companies merging into a new website. How to merge two existing websites into a brand new website and preserve search rankings.
Roy, this is definitely a complex task--one that takes careful planning and organization. The steps outlined in the link you provided are a good start, but that's only a small part of what needs to be done. There are a lot of sub-tasks to take care of in between those larger tasks.
When it comes to the move itself, think about it as moving site A to C and then site B to C. Alternatively, you could first combine both sites: choose the best content from each and then move that to site C.
What's important, though, is to figure out which content and pages are duplicated on both sites, and then choose the best page(s) to move to site C. Content that exists on only one of the sites can simply be moved. The key is to spend plenty of time organizing the content and deciding which content can go away, which needs to be moved, which needs to be combined, and so forth.
There is one major step missing from that list: verify all versions of the sites (http and https, as well as http://www and https://www) in Google Search Console, set up the 301 redirects, and use the Google Change of Address tool to tell Google that the site has moved.
There is also a mention of rel canonical; since the sites are moving entirely, canonical tags aren't appropriate here. You'll need to use 301 permanent redirects to move the content from one site to another, especially since sites A and B won't exist anymore (they'll be redirected).
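Assuming an Apache server, the redirects on site A (and likewise site B) could be sketched in .htaccess like this--the domains and paths are placeholders for illustration:

```apache
# .htaccess on site A: 301 each old URL to its new home on site C
RewriteEngine On
RewriteRule ^old-services/(.*)$ https://www.site-c.example/services/$1 [R=301,L]
# Catch-all for anything unmapped, pointing at site C's home page
RewriteRule ^(.*)$ https://www.site-c.example/ [R=301,L]
```

Ideally each important page maps to its closest equivalent on site C rather than falling through to the catch-all.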
-
RE: Switching from Http to Https, but what about images and image link juice?
Shawn124, whenever you move from HTTP to HTTPS, you'll need to set up 301 permanent redirects for the pages on the site only. Other elements, such as images, JavaScript (if they're external files), and CSS files, only need to be changed in the code so that they reference the new HTTPS URLs instead of HTTP.
If you load an HTTP element (such as an image referenced by its full URL rather than by filename only) on an HTTPS page, the browser will give you a mixed-content error. So generally you need to do two things:
-
Set up 301 permanent redirects for the page URLs.
-
Search the entire website for all references to HTTP and change them to HTTPS (unless you're linking out to an external site).
If the site is in WordPress, you can use the Search and Replace plugin to replace it all at once in the database.
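Outside of WordPress, that search-and-replace can be scripted. A minimal sketch in Python, with www.example.com standing in for your own domain:

```python
# Upgrade references to our own domain from http:// to https://,
# leaving links to external sites untouched.
OUR_DOMAIN = "www.example.com"  # placeholder for your real domain

def upgrade_internal_refs(html: str) -> str:
    return html.replace(f"http://{OUR_DOMAIN}", f"https://{OUR_DOMAIN}")

page = '<img src="http://www.example.com/logo.png"> <a href="http://other-site.example/">out</a>'
print(upgrade_internal_refs(page))
```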
-
-
RE: Glossary Page - best practice
Brian, yes, this is the best practice. The canonical tag is essentially telling the search engines that the letter page is a duplicate of what's on the other page. So, they should give the credit to the other page.
Technically speaking, those letter pages are crawled by the search engines, but since the canonical tag is there, the pages are not indexed.
Again, this is the best practice if you're going to have the content appear in more than one location. Ideally, I would probably split it up into separate pages (a page for each term) if you can write enough content for each term to have its own page. But, given the scenario you're outlining, this is most likely the best practice for your site. I'm assuming that the letter pages are clickable in your site's navigation and that users can click on them easily.
-
RE: Looking at google shopping results from other country
Dieter, one other way you can do this is to use a proxy service. There are several proxy services out there, such as SurfEasy, Strong VPN, or Hola that allow you to surf the web as if you're actually surfing from another country. You can choose the country and then you'll use a VPN to access the sites you need to from another country.
-
RE: SERPs started showing the incorrect date next to my pages
smmour, we've actually noticed this as well, this past week. One site in particular that I'm familiar with shows a date from February 2012 on the site's home page even though the Google cache date shows that the page was cached just the other day.
Google typically takes the pub-date from a site's code, especially on WordPress sites. However, what you're describing sounds more like a Google problem than a problem with your site in particular. Given that we've noticed this as well this past week, it appears to be something you haven't necessarily caused.
What intrigues me is the fact that the domain name wasn't registered and the site wasn't live in 2010, the date that it is showing.
-
RE: Need advice: How to replace a high-ranking pdf with a landing page -- without dropping much in rank?
John, that's a good question. Depending on the competitiveness of the keyword, I would hesitate to set up a 301 redirect--you still may see ranking changes if you replaced it with a landing page.
I suspect that the reason why it's ranking is because other sites are linking directly to the PDF file, the content. If you were to remove that PDF, they might stop linking to it.
One option would be to edit the PDF file and make the PDF itself the landing page (in .pdf format), so that it keeps the same URL.
-
RE: Would you recommend changing image file names retroactively?
Generally speaking, in our experience, it's not worth it to take the time to change filenames (and thus change URLs) unless you absolutely have to. If you're still using the same CMS, then just changing the URL might not be worth it, as you have to take time to set up a 301 redirect from the old URL to the new URL.
What would be worth doing is moving to a flat file structure, such as domain.com/category/page/ or domain.com/page/. In the long run, you generally won't have to change URLs again if you move to another CMS.
If those pages are ranking well with the current URL, you may not want to change the filename. But, if they aren't ranking on the first page, it should be fine to change the filename. Don't forget to set up 301 redirects from the old URL to the new URL.
-
RE: Is having a site map page necessary?
Myles92, recently (in the past few months, I don't recall specifically when) Google did give some recommendations that included having an html sitemap page on your website. For a good user experience, it is recommended that you have a good navigation structure as well as an "html sitemap". The html sitemap page allows users to see the overall structure of the website, and click through to a certain page or section of the site.
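A minimal sketch of generating such a page from a list of the site's pages (the page names and paths here are hypothetical):

```python
# Build a simple HTML sitemap list from (title, url) pairs.
pages = [
    ("Home", "/"),
    ("About Us", "/about/"),
    ("Services", "/services/"),
]

def html_sitemap(pages) -> str:
    items = "\n".join(f'  <li><a href="{url}">{title}</a></li>' for title, url in pages)
    return f"<ul>\n{items}\n</ul>"

print(html_sitemap(pages))
```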
-
RE: Need some strategy advice for Real Estate Attorneys in competitve locations
Donald, when it comes to local SEO, we typically recommend making sure that you first have the local citations taken care of--and then focusing on the content on the website. The Name, Address, and Phone (NAP) should be consistent across the board.
You can manage the local citations, submissions, fixing of duplicates, etc. yourself manually, or you can use one of the services out there, such as Moz Local, Yext, or Advice Local.
Then, as far as the on-site issues are concerned, schema.org markup is a must on the site as well.
It sounds as if you are already handling the content on the site, so continue with that. But the local citations and local listings sound like the missing piece here.
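For the schema.org piece, a JSON-LD block using the Attorney type, carrying the same NAP as the citations, might look like this--every value below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Attorney",
  "name": "Example Real Estate Law Firm",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0100"
}
</script>
```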
-
RE: How do I improve site visibility and keyword ranking for new product site
Sharon, the site looks great--I haven't gone through it the way I normally would during a full SEO audit of the site, but since you mentioned that Moz indicated only a few issues, it sounds okay. I would take care of the issues that it points out, though.
I took some time to look at your site's backlinks, though, and noticed that while you have a few, it just isn't nearly enough to make a difference. A site in your industry would typically have a lot more links--look at your competitors and you'll see that they do.
I would spend some time working on your site's links, as I believe that's going to be the issue you're having, especially given the topic of your site.
-
RE: Disavow links from legit sites but have spammy link profiles?
Godard, that's a good question. What I recommend is that you look at each link individually and determine if that link has been created naturally or in a spammy way. If it's a legitimate link, on a real business's website, then the link should be okay. But, if the link was created in order to try to manipulate Google's rankings then you should disavow it.
Many sites get links just because they're ranking well--and that's typical. It doesn't mean that you necessarily have to disavow them or try to get them removed.
If you are using Moz's Open Site Explorer to see the Domain Authority and Page Authority of the site linking to you, then that would be a way to judge the quality of the site linking to you. If the site linking to you is, in fact, very low quality because it doesn't have a lot of good links pointing to it, then the site's Domain Authority and Page Authority will be low.
When disavowing and working on getting links removed, take a look at Google's Webmaster Guidelines and keep them in mind--if the link wouldn't pass their guidelines, then you should disavow it.