Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Google has no problem crawling JavaScript as long as the files are not blocked in your robots.txt file. See this blog post for more information about what can and cannot be crawled in JS: http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html

    | Giovatto
    0
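    As a quick illustration of the point above (the paths are hypothetical, not from the question), a robots.txt that keeps script and stylesheet directories crawlable while blocking only what genuinely needs blocking might look like:

    ```text
    # Hypothetical robots.txt: leave /js/ and /css/ crawlable so
    # Googlebot can fetch the resources it needs to render pages
    User-agent: *
    Disallow: /admin/
    Allow: /js/
    Allow: /css/
    ```

    Fetch and Render in Google Webmaster Tools can then confirm whether any blocked resources are still affecting how your pages render.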

  • OK, there is no general issue here. It looks like Google is rewriting the title so that it better fits the query: search for a term that appears in the title and the right title is shown. Just try searching for "visueller yoga guide": www.google.de/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=visueller yoga guide ^^

    | paints-n-design
    0

  • As the parent company, I would want any individual work done to benefit the main site, and as a store owner, I would want some benefit from the parent company as well. There need to be policies in place to prevent bad practices, but it should be all stores and the parent company helping each other. If they want to be treated as a separate entity, then they should probably build their own website on a separate domain. Most major retailers keep their stores in subfolders; check walmart.com as an example.

    | TheeDigital
    0

  • Hi Matt-Antonino, Great - thank you. Ok, I'm glad it's not going to be a problem! Thanks again, J

    | JamesPearce
    0

  • Hi Dom, maybe you'll find something here: http://moz.com/community/q/does-anyone-know-of-any-tools-that-can-help-split-up-xml-sitemap-to-make-it-more-efficient-and-better-for-seo Splitting a sitemap can be useful for keeping sight of which URLs are indexed, but it's quite normal to run a sitemap with 5,000 URLs. The maximum is 50,000 URLs per sitemap, so splitting isn't strictly necessary. Grtz, Leonie

    | Leonie-Kramer
    0
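    If you do decide to split a large sitemap, the standard sitemaps.org approach is a sitemap index file that lists the child sitemaps. A minimal sketch (the filenames are made up for illustration):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
        <lastmod>2015-01-01</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-blog.xml</loc>
        <lastmod>2015-01-01</lastmod>
      </sitemap>
    </sitemapindex>
    ```

    You then submit the index file itself to the search engines, and each child sitemap stays under the 50,000-URL limit.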

  • Yep. 301 = permanent redirect = this page used to be here but has now moved to this new location, so let's pretend it's always been this way (and transfer any authority to it).

    | OlegKorneitchouk
    0
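    As a concrete sketch of the above, on an Apache server a 301 is often set up in .htaccess (the paths here are hypothetical):

    ```apacheconf
    # Permanent (301) redirect: the old URL's authority is passed
    # along to the new location
    Redirect 301 /old-page.html http://www.example.com/new-page.html
    ```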

  • Wow. I did some research, and I stand corrected. Thanks, Linda. As far as your categories go, you could have: www.domain.com/computers/notebooks/apple-notebooks/ and www.domain.com/apple-products/ On your category pages, I'd suggest adding unique content at the bottom. A paragraph above the fold would help for ranking purposes, but may detract from usability and conversions.

    | AMHC
    0

  • Great, thanks! I'll give that a go.

    | juun
    0

  • Then you have nothing to worry about as far as being delisted. I doubt the server issue would matter, but do make sure your website loads as fast as possible; that is one factor that influences search results. Suffice it to say, traffic sways up and down based on lots of factors from Google, along with things like the day of the week, seasons of the year, holidays, etc. In my opinion, you have nothing to worry about beyond steadily building your website traffic.

    | Patrick_G
    0

  • Yes, search engines definitely know that subdomains are hooked to your domain, and there's evidence that search engines will count links to a subdomain toward your domain. However, there's speculation that those links are slightly discounted in the level of authority they pass (they are a stronger signal for the subdomain itself), and it's still best practice for your SEO to put your blog in a subfolder instead of on a subdomain.

    | EricaMcGillivray
    0

  • What I do when I want to get an idea of how frequently Google crawls a page is I look at when it was last crawled. If the cached date was a long time ago, Google probably doesn't crawl it that often. If it was recently cached, it could mean a more frequent crawl—but it also might be that I just caught it at the right time. So I look at a few similar pages to see if they agree. (To see when a page was cached, do a search on the URL of the page in question—just put the URL right in the search box. In the results, look next to the green URL in the result which is the page you searched for and there is a little green triangle. Click that, and you will see "cached." Choose that, and it will bring up the version of the page that Google has cached, along with the date it was cached.) Don't worry too much. Even without your fixes, Google will figure out the situation on its own and start showing a preferred URL anyway. But yes, it is generally a good choice to show yourself in the best light and follow best practices to make things as easy as possible for Google.

    | Linda-Vassily
    0

  • Hi Kristina, I just responded to you on the other (related) issue at http://moz.com/community/q/mobile-search-results-show-desktop-crawled-content#reply_270529 before seeing your response here. Sorry about that, and thanks for your response. It really hadn't occurred to me to allow mobile users onto those pages, because the experience would be pretty bad for the most part, and a lot slower, but it definitely is a thought for solving my indexing issue. I could give mobile users a popup the first time they venture into the non-mobile-friendly pages as a heads-up about which parts are mobile friendly and which aren't. I wonder, though, whether Google will penalize the site or those pages in some way, since it will not be able to render them as mobile friendly for a mobile device. EDIT: just saw this: http://www.cnet.com/news/make-web-sites-mobile-friendly-or-face-google-search-wrath/ Anyway, it's an interesting alternative. It wouldn't be as easy as it sounds because of the degree of customization I've done in the mobile menus, but it may well be a lot easier than other alternatives. Thanks very much! Ted

    | friendoffood
    0

  • Thanks for your reply, but the link you pointed me to isn't my situation. I'm not redirecting to separate URLs; mine is this: https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving I HAVE to figure out what Google is doing in my situation, because if I don't and I assume wrong, then lots of pages for either my desktop or mobile-friendly version won't be indexed. Surely lots of website owners have had their developers create a minimal mobile-friendly site with less content and fewer pages than desktop users get and chosen the dynamic-serving approach, but I have yet to receive a reply from anyone who has faced that issue. It's a very serious issue for me, because either I have to consider dumping dynamic serving in favor of separate mobile URLs (if that would work), or I have to do a ton of programming to add content so that all the URLs have both mobile and desktop content.

    | friendoffood
    0

  • Some years ago people used to make this claim about W3C validation badges too. A badge is a badge is a badge. It really is unlikely to affect your search rankings one way or another.

    | AlexMcKee
    0

  • Personally, I try to avoid this approach and focus on quality over quantity. A single, well-written, and widely shared piece of content has the potential to earn many times more links and traffic than 50 purchased or hastily written articles of questionable quality. Really, I don't care how many posts I've written or how many comments I've made (and by directory submissions I hope you mean local directories). What I do care about is how users engage with those activities, how often my content is shared by authoritative influencers, and how these activities contribute to SEO success. Sticking to a structure and/or schedule is important, but I would avoid a cookie-cutter approach to SEO. A couple of presentations by Rand that may be helpful: "Why Content Marketing Fails" and "Building a Marketing Flywheel". Hope that helps! Best of luck with your SEO.

    | Cyrus-Shepard
    0

  • Hi AMHC, It makes sense that with hardly any backlinks built up, Google won't find my uppercase URLs, since all the page links have been changed. However, I am writing out all of the URLs that are redirected into email, and from that I can tell that Google is finding them. I guess they may have a list of URLs from prior indexing that they crawl independent of what their crawler comes up with. I'll keep looking at what they have indexed, and if it turns out they just aren't crawling certain pages, I'll put them in a sitemap to be crawled. It's a good idea for taking care of the problem quickly, so if it progresses too slowly, I'll do that. Thanks very much for your answers!

    | friendoffood
    0

  • http://moz.com/learn/seo/canonicalization "Another option for dealing with duplicate content is to utilize the rel=canonical tag. The rel=canonical tag passes the same amount of link juice (ranking power) as a 301 redirect, and often takes much less development time to implement."

    | Linda-Vassily
    0

  • I would update the canonical tag on your end to reflect that Page A (which is being redirected to Page B) is no longer the canonical/preferred URL. Add <link rel="canonical" href="http://domain.com/page-b"> to both the old and the new page. I would also send the new tag to the third party with something like 'Hi there, I know you're all super busy, so we thought sharing the new canonical tag with you might help get things updated more quickly', or something to that effect.

    | Sheena_Schleicher
    0
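    Following the suggestion above, the full link element (using the example URL from the answer) sits in the <head> of each page:

    ```html
    <!-- Both Page A (the redirected page) and Page B declare
         Page B as the preferred/canonical URL -->
    <link rel="canonical" href="http://domain.com/page-b">
    ```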