Best posts made by MichaelC-15022
-
RE: Do mobile and desktop sites that pull content from the same source count as duplicate content?
Be sure you follow the best practices outlined here for separate mobile sites. In short, you want the desktop pages to have a rel=alternate tag pointing at the mobile equivalent, and the mobile pages to have a rel=canonical pointing at the desktop equivalent.
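The bidirectional annotations for separate mobile sites look like this (hostnames and paths here are hypothetical examples):

```html
<!-- On the desktop page (www.example.com/page), point at the mobile version: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (m.example.com/page), canonicalize to the desktop version: -->
<link rel="canonical" href="https://www.example.com/page">
```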
-
RE: How does a collapsed section affect on page SEO?
Hi Stephan,
Presuming the expand/collapse thing is done properly, it should be golden. You'll find a lot of sites use this approach when they have multiple pages of content, e.g. a product page with specifications, reviews, technical details, etc.
I do this on my travel website. A great way to test to see if the initially-collapsed content is being seen and indexed by Google is to take a block of text from the collapsed section and search for it in double-quotes.
Here's an example: search for "At the Bora Bora Pearl Beach Resort you can discover the sparkling magic of the lagoon". You'll find my site there at #3 (Visual Itineraries), along with the other 1000 websites who've also copied the resort's description straight from the resort's website (yeah, I really shouldn't do this). So much for Google's duplicate content detection when it comes to text chunks...BUT I DIGRESS. That content you see is on the More Info tab.
Now, on to what "done properly" means:
- each tab should be in a separate div
- assign every div a class whose CSS rule is display:none, EXCEPT the currently selected tab's div
- have onclick handlers for the tabs that switch all of the divs to the display:none class, then switch the newly selected tab's div to a class with display:block or display:inline
And not done properly would mean something like changing the text of a div with Javascript onclick()....because Google won't see that text in the Javascript. It's got to be in the HTML.
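Put together, the pattern looks something like this (class names, ids, and content are hypothetical; the key point is that every tab's text is in the HTML from the start):

```html
<style>
  .tab-hidden { display: none; }
  .tab-shown  { display: block; }
</style>

<a href="#" onclick="showTab('overview'); return false;">Overview</a>
<a href="#" onclick="showTab('reviews'); return false;">Reviews</a>

<!-- All tab content is present in the HTML, so Google can index all of it -->
<div id="overview" class="tab-shown">Overview content here...</div>
<div id="reviews" class="tab-hidden">Reviews content here...</div>

<script>
  function showTab(id) {
    // Hide every tab, then show only the selected one
    var tabs = document.querySelectorAll('#overview, #reviews');
    for (var i = 0; i < tabs.length; i++) tabs[i].className = 'tab-hidden';
    document.getElementById(id).className = 'tab-shown';
  }
</script>
```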
That's about it. Not so tricky, really. And works well both for usability (no roundtrip to the server, not even an Ajax fetch!) and for SEO (lotsa yummy content on a single page for Panda).
-
RE: Why are these sites outranking me?
The social interaction counts are going to affect personalized results a lot more than depersonalized results (although this may be changing in the near future....see Eric Enge's post about this).
I'd say your backlink profile stats are right in line with the other 2 sites you mention. I'd say the differences in DA, PA, and RDs linking are really negligible. Your site and those 2 are all similarly tuned for the target phrase, from page title to URL to amount of content on the page.
You might try increasing the amount of unique content on your page. Big, original images, maybe shoot a little intro video of yourself talking about your game, and embed that. And crank up the total text on the page to over 2000 words. (See this study.)
-
RE: Google reconsideration nightmare
From the comment "show more efforts", I'd say you'll want to show not just more success at removing links, but how many times you contacted each webmaster and how.
I've had experiences with a couple of clients where the kinds of links that kept getting pointed out by the Google spam team tended to be article marketing examples, where the pages linking to my client's site were not in the WMT links, not in OSE, etc.....far too weak. So you're not alone there.
I would advise looking at all the examples you can find of any article marketing that was done for your site, then try to find all related pages...i.e., don't JUST try to remove the examples they pointed out. In other words, if you find there's someone named "Andy Smith" authoring some of the article marketing posts they've pointed out, then do a Google search for "Andy Smith" and your brand name to try to find any other article this person wrote for you. In my case, I was able to find quite a collection of pages in the Google Index (not even supplemental...the regular index!) that weren't in the WMT links nor in OSE etc. Also, take a big block of text from the start of each article and search for that in double-quotes, to see if it was posted elsewhere under a different name.
Then chase these down: try to get them taken down, ping each webmaster 3-4 times, then disavow them and submit your reconsideration request.
-
RE: Miriam's 7 Local SEO Predictions for 2019
Great predictions, Miriam!
I'll add one more...maybe it's more of a wish than a prediction...that Google will make some sort of serious strides towards cracking down on fake reviews (both positive and negative). Hopefully not as over-the-top as Yelp's approach (which throws a lot of babies out with the bathwater!) though.
-
RE: Are there any negative side effects of having millions of URLs on your site?
I'll echo Robert's concern about duplicate content. If those facet combinations are creating many pages with very similar content, that could be an issue for you.
If, let's say, there are 100 facet combinations that create essentially the same basic page content, then consider taking facet elements that do NOT substantially change the page content, and use rel=canonical to tell Google that those are all really the same page. For instance, let's say one of the facets is packaging size, and product X comes in boxes of 1, 10, 100, or 500 units. Let's say another facet is color, and it comes in blue, green, or red. Let's say the URLs for these look like this:
www.mysite.com/product.php?pid=12345&color=blue&pkgsize=1
www.mysite.com/product.php?pid=12345&color=green&pkgsize=10
www.mysite.com/product.php?pid=12345&color=red&pkgsize=100
You would want to set the rel=canonical on all of these to:
www.mysite.com/product.php?pid=12345
Be sure that your XML sitemap, your on-page meta robots, and your rel=canonicals are all in agreement. In other words, if a page has meta robots "noindex,follow", it should NOT show up in your XML sitemap. If the pages above have their rel=canonicals set as described, then your sitemap should contain www.mysite.com/product.php?pid=12345 and NONE of the three example URLs with the color and pkgsize parameters above.
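For the example URLs above, every faceted variant would carry the same canonical tag in its <head>:

```html
<!-- On product.php?pid=12345&color=blue&pkgsize=1 (and every other
     color/pkgsize variant of product 12345): -->
<link rel="canonical" href="https://www.mysite.com/product.php?pid=12345">
```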
-
RE: Organic search traffic dropped 40% - what am I missing?
Possibilities:
- The layout of the product pages for the new shopping cart is pissing off Panda. If that's the case, the traffic to the home page shouldn't have changed much, but the product pages will have dropped.
- Panda now sees the pages in general as having less content than before, perhaps images aren't getting loaded in the pages in such a way that Google sees them whereas they were before, something like that....and Panda now thinks the entire site is less rich in content.
- It often seems to take Google a month or so to "settle out" all of the link juice flows when you do a bunch of redirects, have new URLs, etc. I would expect that the link juice calculation is iterative, and that would be why it would take a number of iterations of the PageRank calculation in order for entirely new URLs to "get" all the link juice they should have.
- The site's backlinks were moderately dependent on a set of link networks, and those link networks have shut down all their sites (so that neither Google nor Bing still sees the links from them).
Those are the ideas that come to mind so far.
-
RE: Query results being indexed and providing no value to real estate website - best course of action?
Ideally, you'd set the meta robots in that page to noindex,follow. This will allow link juice to flow from all of those pages to the pages in your main navigation as well as removing them from the index.
If you cannot modify the <head> section of those pages, then, at a minimum, you could tell Webmaster Tools to ignore the pre and start parameters (specify that they merely sort the data on the page). Then you'd end up with just 1 page indexed per city, which is probably a lot better than where you are now.
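The meta robots tag in question is a one-liner in each query-result page's <head>:

```html
<!-- Drop the page from the index, but let link juice flow through its links -->
<meta name="robots" content="noindex,follow">
```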
-
RE: Homepage ranks worse than subpages
I'd agree with Monica. Panda's above-the-fold algo is absolutely going to slay your home page. You've got only 1 sentence of content above the fold. Your images in the slider are all clickable (except the Lego image), and besides, they don't seem to be foreground images (except the Lego image)...Panda is likely going to see them as decoration.
Your video is probably not seen as video. I see no schema.org/VideoObject markup, and it doesn't seem to be one of the standard embeds (YouTube, Vimeo, Wistia) that Panda can likely recognize in the HTML.
Everything else on the page is clickable, which (this is my theory only) is likely to cause Panda to see it as navigation....not content.
So....I'd recommend:
- chucking your current slider: choose a different plugin (or write it from scratch, it's only a couple dozen lines of Javascript and a few lines of plain old boring HTML), so that the images are seen as content AND they're not clickable, except the next/prev slide buttons
- redesign your layout to pull some of the text below up above the fold, including moving the Archive & Communicate section and its siblings above the giant buttons
- use Wistia to embed the video, and follow their instructions re: creation of a video sitemap
I'd also recommend going into Google Webmaster Tools, and doing a Fetch & Render on your home page, to make sure that Google is able to see your page laid out the way you expect.
-
RE: Client wants to rebrand but insists on keeping their old website live as well...
I'll second Miriam's points, above. There's substantial risk here if both sites are going to be visible to Google.
I'd block the old site in robots.txt permanently. I'd never redirect the old site to the new, even if cleanup had been done. From the penalty-recovery work I've done, it sure feels like Google keeps some sort of permanent flag on your site, even after you've done the cleanup: new, good links don't seem to have as much effect as you'd expect.
For the new site, spend the $$ and do some PR/outreach and some solid, strong links in addition to the core directory links you get via MozLocal. Do some community service work that gets a press mention; offer a scholarship to dentistry students from a specific school, so that the school will link to your scholarship page. A few really good links from newspaper stories will work wonders for getting the new site to rank, both in the 3-pack and in regular organic.
-
RE: Homepage ranks worse than subpages
I would agree IF the video is something that people would search for, e.g. big branding content, humor, viral content, music or celebrity related.
Otherwise you're just generating traffic for YouTube. Of the visitors to your company's YouTube page, only around 1% will click through to your website.
And the YouTube page for your video is likely to outrank your own page that embeds that video.
Phil Nottingham from Distilled is the master of the universe when it comes to video SEO...see his writeup here where he talks about the pros and cons.
-
RE: SEO Impact of High Volume Vertical and Horizontal Internal Linking
No, keep doing it the way you're doing it. That's perfectly good link juice flowing between those pages.
Breadcrumbs are a nice way to communicate the hierarchy to Google--not because they're breadcrumbs, but simply because of their nature: all pages at each level contribute link juice back up to each of its ancestor pages. A child page has the least internal links; its parent has more; its grandparent even more; etc.
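A breadcrumb trail on a child page might look like this (the site structure here is hypothetical); since every descendant page carries it, ancestor pages accumulate internal links in proportion to how high up the hierarchy they sit:

```html
<nav>
  <a href="/">Home</a> &gt;
  <a href="/widgets/">Widgets</a> &gt;
  <a href="/widgets/blue/">Blue Widgets</a>
</nav>
```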
-
RE: .org appearing in browser search when .com is the primary domain
I'll 2nd what Jim said. Also, if you're looking at search RESULTS (not Google Suggest), then after you run your query, append &pws=0 to the URL to depersonalize the results....then see which is chosen.
If you have the .org pages 301 redirected to the .com, then you're in good shape regardless. You should use the httpfox Firefox plugin to watch the HTTP response stream when you type [yourdomain].org to make sure it's a clean 301 redirect to the .com version.
-
RE: Advice on Security Concerns for WordPress Site
By the way, if your site has ANY visibility at all, you can expect an automated attack on the WP admin login maybe once an hour. All day, every day. So, if you leave your admin account active, and give it a password that's in the dictionary, you'll be seeing injection scripts in your home page and you'll be the proud assistant distributor of various pharma products within a month or two :-).
Right now, I'm seeing so many scripted attacks from China, I blocked all China IP addresses from my travel website. I was seeing hundreds to thousands of attempts per day before that.
-
RE: Best Google Practice for Hacked SIte: Shift Servers/IP or Disavow?
OK, so they're scraping much of your site, and then adding in their own garbage etc.
I wouldn't worry about the occasional instance of this, unless you do see a penalty. For the more egregious ones, where they're building a ton of links, I'd throw their domain in your disavow list.
-
RE: What if a 301 redirect is removed?
#1: no. Once both pages have been recrawled, and maybe a month has gone by to "settle" out the link juice, they'll be independent pages again.
Now, having said that, it's very possible that once you've 301'd a URL, it's going to be very low on the crawl priority, as the 301 TOLD Google that the redirect was permanent. But eventually it'll recrawl it. You can force it in WMT with a Fetch as Googlebot + Submit URL.
When Google appears to have the memory of an elephant regarding links, the circumstances are usually something like this:
- Google crawls the URL and gets Good Stuff.
- Then, the URL goes away (404s or 500s).
- Google is hoping to see that lovely lost URL come back, and even if it no longer finds links to that URL (internal or external), it will continue to try to refetch that for quite some time (months, it seems). Ditto Bingbot, btw.
- In the absence of new info (the page simply is missing or broken), Google will keep its cache of what was on the page, show it in the SERPs, and retain link metrics from it to other pages....for a LONG time.
I've seen no evidence at all that Google has a "memory" for past link juice and transfers that juice the way you've described. However, it seems clear that the folks at Google DO have the ability to look at link history manually, through their tools....for instance, to evaluate changes in backlinks for penalty reconsideration.
-
RE: Google keywords
There are a number of things that Google looks at to determine how relevant your page is to a certain keyword phrase:
- presence of keyword phrase in page title
- presence of keyword phrase in URL
- presence of keyword phrase in domain
- presence of keyword phrase in body text
- presence of keyword phrase in image ALT text and image filenames on the page
- presence of keyword phrase in both internal and external links to the page
Meta description isn't really considered by Google in terms of relevance/ranking, but of course it's what's shown to the user in the search results as the "snippet" from the page, below the page title--so it can affect your conversion (from showing in search results to clicks through to your site).
Typically, you'll want to identify a primary target keyword phrase for each page. Make sure your page title STARTS with that phrase; make the phrase part of your URL (after removing punctuation and special characters, and replacing spaces with hyphens); put it in your H1 heading on the page; have it appear a couple of times in the body text; and have an image or two with the phrase in its ALT text and filename.
Page title is probably the most important here (assuming your domain name isn't an exact match for the keyword--that tends to be REALLY strong still).
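Pulling those on-page elements together, a page targeting the hypothetical phrase "blue widgets" might look like:

```html
<!-- URL: https://www.example.com/blue-widgets (hypothetical) -->
<html>
<head>
  <title>Blue Widgets - Example Store</title>
  <!-- Not a ranking factor, but it's your snippet in the SERPs -->
  <meta name="description" content="Shop our full range of blue widgets.">
</head>
<body>
  <h1>Blue Widgets</h1>
  <p>Our blue widgets come in four sizes...</p>
  <img src="blue-widgets.jpg" alt="Blue widgets on display">
</body>
</html>
```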
-
RE: Are bad links the reason for not ranking?
I'd agree with Keri on this one. If you want to start with a fairly inexpensive investment to see how bad the problem is, buy a subscription to Link Research Tools and run a Link Detox report on your site. If you get a pretty bad result, then go to the Moz recommended list to find someone with experience identifying and getting you out of penalties.
-
RE: Will I mess with Authorship if I setup multiple client websites under my Webmaster tools login?
You should be just fine with G+ authorship as long as the clients' pages include the rel=author links.
Yes, there are alternate ways of establishing G+ authorship, but it's my understanding that the explicit 2-way links are the core means of identifying the author-document relationship, and that the alternate methods are just that.
I think it's pretty safe to verify these using Google's own structured data test tool. Yes, it has a few bugs, and fails to read some websites, but it's overall pretty reliable, and I would expect that it's using a very similar bit of logic to determine authorship that the SERPs-producing code is using.
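On the page side, the explicit link looks like this (the profile URL is a made-up example); the other half of the 2-way link is the "Contributor to" entry on the G+ profile pointing back at the site:

```html
<a rel="author" href="https://plus.google.com/112345678901234567890">
  About the Author
</a>
```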
-
RE: Should I disavow local citation page links?
I wouldn't sweat it. There are a jillion 3rd-tier business listing directories out there that are pulling that sort of data from the major directories. Yes, it's an issue if ALL you have is super weak links, but you'll need to be doing outreach for link-building anyway so that should not be a big deal.
I'd only disavow links that are actual spam. Not weak but legitimate links.