Category: Technical SEO Issues
Discuss site health, structure, and other technical SEO issues.
-
Exclude mobile pages from non-mobile Google SERPs
I believe the most efficient and easiest approach is user-agent detection, which lets the server deliver the correct version of the website to each visitor. When a visitor is on a desktop computer, the server detects that and shows the normal website; when the visitor is browsing from a mobile device, it automatically serves the mobile version of the website.
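As a rough sketch of the user-agent detection described above (the keyword list is illustrative only, not an exhaustive or authoritative set of mobile identifiers):

```python
import re

# Naive User-Agent sniffing to decide which template to serve.
# The keyword list below is a hypothetical, illustrative subset.
MOBILE_UA_PATTERN = re.compile(
    r"Mobile|Android|iPhone|iPad|BlackBerry|Opera Mini", re.IGNORECASE
)

def is_mobile(user_agent: str) -> bool:
    """Return True when the User-Agent header looks like a mobile browser."""
    return bool(MOBILE_UA_PATTERN.search(user_agent or ""))

def pick_template(user_agent: str) -> str:
    # Serve the mobile template to mobile visitors, desktop otherwise.
    return "mobile.html" if is_mobile(user_agent) else "desktop.html"
```

Note that when you vary the response by device like this, it is generally advisable to send a `Vary: User-Agent` header so caches and crawlers know the content differs.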
| MoosaHemani0 -
Descriptions missing from rankings associated with Google Place pages.
Hi Michael, It really depends on how Google decides to display the local results, which can vary based on the keyword being used. The most common way of displaying results is called the 7-Pack (see example), in which no description is used for the map listings. Occasionally, you will see a local search result that does include a description, but this usually only happens when the search phrase is specific to a single business (see example). So it really depends on Google, and there isn't a way you can force them to show a description. It all comes down to the keyword you're ranking for. On the upside, Google Places pages tend to get a lot of attention when they appear in organic search results. Here's an interesting article by Dr. Pete about it. Please note that although some of the examples Dr. Pete uses do show descriptions, Google has since changed the way they typically display local search results. Hope this helps! Tim
| TimKelsey0 -
Are site-wide links bad for web developers?
We've gotten new business from our signature links. Just use a nofollow tag and you'll be within guidelines. The link is there to get new customers, not to increase search rankings. It doesn't make sense to remove the link; doing so would be a bad user experience.
| EvolveCreative0 -
Targeting US search traffic
If you are only targeting US traffic, then I would definitely set GWT to target the US. You will never be able to completely cut off foreign traffic, and you shouldn't want to. However, if you want to increase visits to the site, you should build backlinks from US sites, blogs, etc., and start using social networks to target US nationals.
| MassivePrime0 -
Limit for words on a page?
I like long pages. I have one longish article that is followed by over 500 comments which makes for a REALLY long page. The comments bring in a huge amount of long tail traffic!
| MarieHaynes0 -
Putting blog excerpts in footer of every page?
On my site, I have a sidebar widget that links to 10 random pages on my site. I would love to have it link to 10 relevant pages on my site as EGOL suggests...It's on my list of things to program one day! I do feel that these are helpful to me. The main reason why I like this widget is because I think it helps me keep my pages in Google's index. My site has over 3000 pages and I think many of them would end up in the supplemental index if they were not regularly linked to internally like they are. Also, I have it set so that they link using my desired anchor text for the page. This may help somewhat in my SERPs for each page. Oh, and all of these links are followed on my site.
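The random internal-links widget described above could be sketched roughly like this (the page list and anchor texts are hypothetical placeholders; the real widget is presumably server-side code on the site):

```python
import random

# Hypothetical map of internal URLs to each page's preferred anchor text.
PAGES = {
    "/articles/seo-basics": "SEO basics",
    "/articles/link-building": "link building guide",
    "/articles/site-audits": "site audit checklist",
}

def random_internal_links(pages, count=2, seed=None):
    """Pick `count` random pages and render followed links with the
    desired anchor text, as described in the answer above."""
    rng = random.Random(seed)
    chosen = rng.sample(sorted(pages.items()), min(count, len(pages)))
    # No rel="nofollow": the answer notes these links are followed.
    return ['<a href="%s">%s</a>' % (url, anchor) for url, anchor in chosen]
```

The `seed` parameter is only there to make the selection reproducible for testing; a live widget would omit it so the links rotate on each render.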
| MarieHaynes0 -
Prospective new client hit by webspam, looking for new resource
Thanks so much, Ben. You're right, a new domain is not an option; they're a big company. At this point, I'm not sure what to charge. This is going to take a lot of time, and I'm not sure I even want to be associated with it now. It seems like the old SEO should do the cleanup and provide documentation that it's done. Has anyone found a service to do this?
| tcmktg0 -
Use webmaster tools "change of address" when doing rel=canonical
Well, rel=canonical will technically work, but if you are migrating the entire site (presuming every page on the old domain has a destination on the new domain), I would set up 301 redirects from the old domain to the new domain instead. I don't think you need to do anything special in the Webmaster console: just remove the pages from the old domain and 301 each old URL to its new location. In my opinion, the Google Webmaster Console site-migration ("change of address") feature is for scenarios where, say, you have yourdomain.net, you acquire the domain yourdomain.com, and you want to migrate from .net to .com, or you change the brand name to yourbrandname.net, and so on. Does that make sense? I hope that helps.
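As a minimal sketch of the domain-wide 301 redirect described above, here is a tiny WSGI app (the new hostname is a placeholder; in practice this logic usually lives in the Apache or nginx configuration rather than application code):

```python
# Placeholder destination domain for illustration.
NEW_HOST = "https://www.example.com"

def redirect_app(environ, start_response):
    """301-redirect every request to the same path on the new domain.

    Preserving the path and query string means each old URL maps to
    its exact counterpart, which is what lets link equity transfer.
    """
    path = environ.get("PATH_INFO", "/")
    query = environ.get("QUERY_STRING", "")
    location = NEW_HOST + path + ("?" + query if query else "")
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]
```

The key point is the one-to-one mapping: redirecting everything to the new homepage instead of page-to-page would lose most of the accumulated authority.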
| NakulGoyal0 -
Crawl Diagnostics Summary
There are two ways to initiate a crawl: 1. There is a weekly crawl associated with any site for which you have set up a campaign; each weekly crawl is fresh data. 2. You can manually initiate a crawl by going into your SEOmoz tools and using the Web Crawl tool; each manual run is a unique crawl. If you feel you have made the proper changes and the crawl tool is still showing errors, the most likely cause is either that the changes were not saved on your web server or that they were not performed correctly. If you are confident the changes were made correctly, you can also contact the SEOmoz help desk (help@seomoz.org) to report a bug in the Web Crawl tool.
| RyanKent0 -
Does part of a keyword phrase need to be repeated in a sub folder?
Do not repeat it. The structure web-design/price-cost-calculator is good and gets the keywords you want into the URL. On the other hand, if you lengthen it and repeat the keyword, you'd be more likely to lose keywords from the URL, because Google only shows a limited portion of the URL in the SERPs.
| jgower0 -
Copying my content
Regarding your last point, I am sure I recently watched a Whiteboard Friday where Rand mentioned this was a good way of getting backlinks. I did question this myself when I heard it.
| Paul780 -
Understanding Duplicate Titles in Wordpress
Hi hfranz, Totally frustrating to find pages like that, isn't it? If I read your response right, you can't find where those rogue "Page" pages are being linked to from, so you can't fix the problem? You could try and give it a crawl with Xenu (http://home.snafu.de/tilman/xenulink.html). It should be able to tell you which pages are linking to the rogues. Once the crawl has run (tell it you don't need a report) right click on the links you want to dig into, and select "URL Properties". At the bottom of the window that pops up, it will say "# Pages Linking To This One" and give you a list of the URLs that link to that page. e.g. find www.example.com/blog/page/13/ on the results, right click > URL Properties and then look for the pages that link to it. If you're having trouble seeing the link visually, then view source and ctrl+F (or the Mac equivalent) to find it. It's a place to start tracking down the problems, at least. Someone more familiar with Yoast will have to chime in on how to fix it (if it's a Yoast problem) but you could always just remove the links to the rogue pages, since they don't sound critical. Hope this helps!
| BedeFahey0 -
NoIndex/NoFollow pages showing up when doing a Google search using "Site:" parameter
Thanks, appreciate you taking the time to write out a response!
| ChrisRoberts-MTI0 -
Is building a page using HTML5 better for SEO?
Hey Slava, That's kind of what I thought as well. Using the latest standards-compliant code could be an indication that a site owner is serious about the site being a) up to date from a technology viewpoint and b) accessible to as many devices as possible. Cheers, Jim
| jimpannell0 -
Severe Drops in Google UK rankings
I'd definitely avoid having unrelated keywords on a page, but if it's variations, I would consider how much traffic each one gets and either focus on one or make an extra page (presuming I have good, non-duplicate info for said page). We've actually had a lot of success recently by just targeting the most prominent variation in the titles/H1s and having variations in H2s and body text, with a few quality links to each (most of the links are for the brand name on the homepage). It looks a bit more natural that way. One thing to check is when your rankings dropped. If it was all in the same week/day, you might be affected by the new Panda/Penguin etc. updates, in which case there's been a lot of actionable advice on the blog.
| My-Favourite-Holiday-Cottages0 -
Organic traffic down after 301 internal link structure change
A small amount of weight/authority will be lost when changing the URLs and 301 redirecting to the new locations. It's only fractional, but it could be the difference of 3 or 4 positions in the search results within your niche. It's also possible there is an issue with the third-party plugin you used; they may have overlooked something critical in this process. When did you notice the drop in rankings/traffic? Is it possible it could also be due to Panda/Penguin preparation/updates?
| zigojacko0 -
Slide show showing up as video in serp
So, I don't think this site has a video sitemap, though that could be helping. A lot of videos are implemented in Flash, and Google has recently begun crawling more and more Flash files and indexing them as videos. Google can't see past the Flash code, which includes <param name="movie">, indicating that the file is essentially a video. Whether the content is a slideshow rather than a video is more a semantic than a technical issue, as it's technically identical to a normal Flash video, so the video rich snippets it receives are essentially relevant. If you have lots of Flash image sequences implemented across sites, then I would encourage you to treat them as videos and use the movie schema markup, as well as creating and submitting a video sitemap. You won't always get video rich snippet results for the content, but it can't hurt to provide the engines with more metadata and information.
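A minimal one-entry video sitemap of the kind mentioned above could be generated like this (URLs and titles are placeholders; the namespaces are the standard sitemap and Google video-sitemap namespaces):

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
VIDEO_NS = "http://www.google.com/schemas/sitemap-video/1.1"

def build_video_sitemap(page_url, video_url, title, description):
    """Build a single-entry video sitemap as an XML string."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("video", VIDEO_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
    ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = page_url
    video = ET.SubElement(url, "{%s}video" % VIDEO_NS)
    # content_loc points at the media file itself (here, the Flash file).
    ET.SubElement(video, "{%s}content_loc" % VIDEO_NS).text = video_url
    ET.SubElement(video, "{%s}title" % VIDEO_NS).text = title
    ET.SubElement(video, "{%s}description" % VIDEO_NS).text = description
    return ET.tostring(urlset, encoding="unicode")
```

A real sitemap would loop over many entries and include further optional tags (thumbnail, duration, etc.), but this shows the basic shape of what you'd submit.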
| PhilNottingham0