Category: Web Design
Talk through the latest in web design and development trends.
-
Block parent folder in robots.txt, but not children
The idea from Andrew is nice, but my guess would be that you're targeting multiple events, so that might run into issues. What you could do is use the pattern-matching that Google supports and make it like this: Disallow: /news/events-calendar/usa$ (note that robots.txt rules must start with /, and that robots.txt doesn't support full regular expressions — Google honors only the * wildcard and the $ end-of-URL anchor).
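For completeness, here is a minimal robots.txt sketch of that approach (the paths are taken from the example above, but treat the exact structure as hypothetical). The $ anchor blocks only the parent URL itself while leaving everything underneath it crawlable:

```
User-agent: *
# Blocks /news/events-calendar/usa itself ($ = end of URL)...
Disallow: /news/events-calendar/usa$
# ...while children such as /news/events-calendar/usa/some-event/
# remain crawlable, since they don't end at "usa"
```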
| Martijn_Scheijbeler0 -
Shutting down m. domain
If you are going to literally shut down your m. subdomain, then the best way to do it is to use 301 Permanent Redirects to redirect the traffic to the appropriate page on the main domain. I would NOT use canonical tags in this case.
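As a sketch of what that looks like in practice (assuming an Apache server with mod_rewrite enabled, and example.com as a placeholder domain), the 301s could be handled in the .htaccess file served for the m. subdomain:

```apache
# .htaccess served for m.example.com (placeholder domain)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^m\.example\.com$ [NC]
# Send each mobile URL to the same path on the main domain, permanently
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Mapping each mobile URL to its equivalent page (rather than redirecting everything to the homepage) preserves the most link equity.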
| GlobeRunner0 -
Https pages indexed but all web pages are http - please can you offer some help?
You have a lot of questions in here. We are going to need to limit this thread to your main question of the https URLs being indexed. Can you share the domain? Have you claimed the https domain in Search Console yet to see if these indexed URLs are being shown in search results?
| katemorris0 -
Problems preventing Wordpress attachment pages from being indexed and from being seen as duplicate content.
Hi Kate, Here is an update on what is happening so far. Please excuse the length of this message. The database, according to the host, is fine (please see below), but WordPress is still calling https:

* In the WP database wp-actions, http is definitely being called
* All certificates are OK and SSL is not active
* The WordPress database is returning properly
* The WP database mechanics are OK
* The WP config file is not doing https returns; it is calling http correctly

They said that the only other possibility could be one of the plugins causing the problem. But how can a plugin cause https problems? I can see 50 different https pages indexed in Google. Bing has been checked and there are no https pages indexed there. All internal URLs have always been http only, and that is still the case. I have Google-fetched the website pages, and of the 50 https pages most are images, which I think must have come from the Yoast sitemap that was originally submitted to the search engines. (More recently I have taken all media image URLs out of the Yoast sitemap and put noindex, follow on all image attachment files; the pages and the images on the pages will still be crawled and indexed, it just means that the image URLs won't be.) What will happen to those unwanted https files, though? If I place rel canonical links on the pages that matter, will the https pages drop out of the index eventually? I just wish I could find what is causing it (analogy: best to fix a hole in a roof to stop having to use a bowl to catch the water each time it rains). I looked at Analytics today and saw something really interesting (see attached image): you can see 5 instances of the trailing slash at the home page, and to my knowledge there should only be 1 for a website. The Moz crawl shows just 1 home domain, http://example.co.uk/, so I am somewhat confused.
Google search results showed 256 results for https URL references, and there were 50 available to click on. So perhaps there are 50 https pages being referenced for each trailing slash (could there be 4 other trailing-slash duplicate pages indexed, and how would I fix it if that is the case?). This might sound naive, but I don't have the skill set to fix this at this time, so any help and advice would be appreciated. Would the Search and Replace plugin help at all, or would it be a waste of time since the WordPress database mechanics seem to be OK? I can't place any https-to-http 301 redirects for the 50 https URLs that are indexed in Google, and I can't add any https rewrite rules in .htaccess, since that type of redirect will only work if an SSL certificate is active. I already tried several redirect rules in .htaccess and, as expected, they wouldn't work, which again probably means that SSL is not active for the site. When https is entered instead of http, there should be an automatic resolve to http without me having to worry about it, but I tried again and the https version appears with a red diagonal line through it. The problem is that once a web visitor lands on that page they stay in that land of https (visually, the main nav bar contents stretch across the page and the images and videos don't appear), so the traffic will drop off: a bad experience for the user, dropped traffic, decreasing income and bad for SEO (split page juice, decreased rankings). There are no crawl errors in Google Search Console, and Analytics shows Google Fetch completed for all pages, but when I request fetch and render for the home page it shows as partial instead of completed. I don't want to request any https URL removals through Google and the search engines; it's not recommended, because Google states that the http version could be removed as well as the https.
I did look at this last week: http://www.screamingfrog.co.uk/5-easy-steps-to-fix-secure-page-https-duplicate-content/ Do you think that the https URLs are indexed because links pointing to the site are using https? Perhaps most of the backlinks are https, but the preferred setting in Webmaster Tools / Search Console is already set to the non-www version instead of the www version, and there has never been an https version of the site. This was one possibility regarding duplicate content. Here are two pages and the listed duplicates: the first Moz crawl I ever requested came back with hundreds of duplicate errors, and I have resolved this. The Google crawl had not picked this up previously (so I figured everything had been OK), and it was only realised after that Moz crawl. So https links were seen to be indexed, and the goals are to stop the root cause of the problem and to fix the damage so that the https URLs can drop out of the SERPs and the index. I considered that the duplicate links in question might not be true duplicates as such; it is actually just that the duplicate pages (these were attachment pages created by WordPress for each image uploaded to the site) have no real content, so the template elements outweighed the unique content elements, which was flagging them as duplicates in the Moz tool. So I thought that these were unlikely to hurt, as they were not duplicates as such, but they were indexed thin content. I did a content audit and tidied things up as much as I could (blank pages and weak ones), hence the recent new sitemap submission and fetch to Google. I have already redirected all attachments to the parent page in Yoast, removed all attachments from the Yoast sitemap, and set all media content (in Yoast) to 'noindex, follow'. Naturally it's really important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed.
Luckily I haven't started any backlinking work yet, and any links I have posted in search land have all been the http version. As I understand it, most server configurations should redirect by default to http when https isn't configured, so I am confused as to where to take this, especially as the host has given the WP database the all clear. It could be taxonomies related to the theme or a slider plugin, as I have learned these past few weeks. Disallowing and deindexing those unwanted https URLs would be amazing, since I have so far spent weeks trying to get to the bottom of the problem. Ideally, I understand from previous weeks that these 2 things would be very important: (1) 301 redirects from https to http (the host in this case cannot enable this directly through their servers, and I can only add these redirects in the .htaccess file if there is an active SSL certificate in place). (2) Have in place a canonical URL using http for both the http and https variations. Both of those solutions might work on their own, and if the 301 redirect can't work with this host, then the canonical will fix it? I saw that I could just set a canonical with a fixed transport protocol of http:// and Google will then sort out the rest. Not preferred from a crawl perspective, but would it suffice? (Even so, I don't know how to put that in place.) There are around 180 W3C validation errors. Would it help matters to get these fixed? Would this help to fix the problem, do you know? The homepage renders with critical errors and a couple of warnings. The 907 Theme scores well for its concept and functionality, but its SEO reviews aren't that great. The duplicate problems are not related to the W3 Total Cache plugin, which is one of the plugins in place.
Regarding add-ons (trailing slash): for example, http://domain.co.uk/events redirects to http://domain.co.uk/events/. The add-on must only do it on active URLs; even if it didn't, there were no reports of trailing-slash duplicate errors in the Moz crawl, so it's a different issue that would need looking at separately, I would think. At the bottom of each duplicate page there is an option for noindex. There are page sections and parallax sections that make up the home page, and each has to be published to become a live part of the home page. This isn't great for SEO, I understand, because only the top page section is registered in Yoast as being the home page; the other sections on the home page are not crawled as part of the home page but are instead separate page sections. Is it OK to index those page sections? If I set them to 'noindex, follow', would that be good practice here? The theme does not automatically block the page sections from appearing in search engines. Can noindex only be put on whole pages and not on specific page sections? I just want to make sure that the content on all the pages (media and text) and page sections is crawlable. To ultimately fix the https problem with the indexed pages out there, could this eventually be a case of having to add SSL to the site just because there is no better way, just so the https-to-http redirect rule can be added to the .htaccess file? If so, I don't think that would fix the root cause of the problem; the root cause could be one of the plugins? Confused. With canonical URLs, does that mean the https links that don't have canonicals will deindex eventually? Are the https links giving a 404? (I'm worried because normally 404s need 301s, as you know, and I can't put a 301 on an https URL in this situation.) Do I have to set a canonical for every single page on the website because of the extent of the problem? Nearly all of the traffic is being dropped after visiting the home page, and I can't for the life of me see why.
Is it because of all these https pages? Once canonicals are in place, how long will it take for everything to return to how it should be? Is it worthwhile starting a PPC campaign, or should I wait until everything has calmed down on the site? Is this a case of setting the canonical URL and then the rest will sort itself out? (Please see the screenshot attached regarding the 5 home pages that each have a trailing slash.) This is the entire current situation. I understand this might not be so straightforward, but I would really appreciate help, as the site continues to drop traffic and income. Others will be able to learn from this string of questions and responses too. Thank you for reading this far and have a nice day. Kind Regards,
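For clarity, the fixed-protocol canonical I'm describing would just be a link element in each page's head, served identically on both the http and https versions and always pointing at the http URL (a sketch using the example.co.uk domain from above with a placeholder path):

```html
<!-- in the <head> of BOTH http://example.co.uk/some-page/
     and https://example.co.uk/some-page/ -->
<link rel="canonical" href="http://example.co.uk/some-page/" />
```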
| SEOguy10 -
I can see https URLs being indexed and they shouldn't be
Thanks Logan. This is obviously something I haven't learned to do yet. But an SEO's knowledge is never complete, hence me asking questions here in the forum. I think we all have our simple and more complicated questions to ask. I appreciate your help, and there is certainly more I will have to learn, that's for sure.
| SEOguy10 -
In Wordpress getting marked as duplicate content for tags
Thanks, that's what we ended up doing with Yoast.
| limited70 -
How do you influence the default site title?
It used to be that titles only came from two sources: the <title> tag and DMOZ. If the organic result title didn't match the title tag, then you'd check DMOZ. Unfortunately, now Google pulls data from all over the place, including Google+ listings and the Knowledge Graph. Google has become very interested in understanding brands as entities and is bringing a lot of data into play, sometimes poorly.
| Dr-Pete0 -
Services/Companies with expertise to improve WP site speed?
Thanks guys, and sorry for the late response... Here is the website: http://www.bamboozz.net/ Please let me know what we can do to increase the speed. Thx
| EdmondHong870 -
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Thank you for the update Kavit.
| Everett0 -
Fetch as Google not showing Waypoints.js on scroll animation
Googlebot certainly fetched, and it definitely rendered. But it didn't scroll, so the animation never took place. Barring anything unorthodox, the content is there; it's just waiting to be animated when the scroll event triggers the function (onScrollInit). Here's more about animations and Waypoints.js. There's a pretty concrete example toward the bottom of the linked article. It will show you that, yes, the content is there in HTML form; it's just waiting to be displayed in whatever fancy way. The JavaScript approach could be problematic in some instances. For some reason your JavaScript might not load on some sessions, or perhaps someone will visit the site with JavaScript disabled in their browser. The former is more likely than the latter, for a number of reasons. Barring any concrete example beyond the image (we need URLs!), you can check the live URL. You would do so using the cache: operator. Usage is as follows: cache:thesite.com/the-fancy-javascripts This will show you a cached version of the page. Should it be live, and unblocked by various robots (txt and/or meta), it may be cached by now. If you can see the animations firing on scroll, the content is indexed. Though it's generally preferable to show all content as soon as possible, without much hand-waving (fancy JavaScript animations, etc.). Edit: I also forgot to mention one pretty critical thing. Make sure legitimate search engine bots have access to the site's CSS and JavaScript. If they don't, that will create problems as well.
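To make the pattern concrete (all names here are hypothetical, not taken from the site in question): the content is in the HTML from the start, and scrolling only toggles a CSS class that animates it in. The "has this element entered the viewport?" check is a pure calculation, which is roughly the decision Waypoints makes for you:

```javascript
// Sketch of an on-scroll reveal. Names (.animated, revealHandler) are
// hypothetical illustrations, not the original site's code.

// Pure check: has the element's top edge entered the viewport yet?
function hasScrolledIntoView(scrollY, viewportHeight, elementTop) {
  return scrollY + viewportHeight >= elementTop;
}

// In a browser, a Waypoints-style library would call a handler like this
// when the element scrolls into view; it only adds a CSS class, because
// the content itself is already in the HTML for crawlers to see.
function revealHandler(element) {
  element.className += ' animated';
}

// Viewport is 800px tall, scrolled 400px down; an element whose top sits
// at 1000px is visible (400 + 800 >= 1000), one at 1500px is not yet.
console.log(hasScrolledIntoView(400, 800, 1000)); // true
console.log(hasScrolledIntoView(400, 800, 1500)); // false
```

The key takeaway is that the animation is cosmetic: whether or not the handler ever fires, the crawlable content is identical.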
| Travis_Bailey0 -
Pushstate and Infinite Scrolling Article Pages: Is it detrimental to not change URLs as the page is being scrolled?
The parallax or infinite-scroll effect is going to cause some SEO issues, since it is basically one webpage with a single URL. This is going to make it difficult to optimize your site for a variety of search terms, and could lead to keyword dilution, since you are optimizing only a single page. I would imagine an infinite-scroll/parallax site would present some analytics issues as well. One issue I have noticed with some client accounts that I have put heatmaps on is that no one scrolls below the fold, so I would imagine that with a single infinite-scroll site design you would see higher bounce rates. I have seen some sites, however, use their homepage as a parallax scrolling page with specific content pieces linked throughout the homepage. Google has provided some insight into infinite-scroll pages on their official webmasters blog, which can give you some more detail.
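To sketch the URL-updating approach Google's guidance describes (the /articles?page=N URL scheme and page height here are hypothetical): the feed is divided into component pages, each with its own crawlable URL, and the address bar is updated as the corresponding chunk scrolls into view. Which chunk is in view is a pure calculation:

```javascript
// Sketch of pushState-style URL updates for an infinite-scroll feed.
// URLs and dimensions are hypothetical.

// Pure helper: which component page is on screen at a given scroll offset,
// assuming each chunk of the feed is pageHeight pixels tall?
function pageForOffset(scrollY, pageHeight) {
  return Math.floor(scrollY / pageHeight) + 1;
}

// In the browser, a (throttled) scroll listener would then keep the URL
// in sync with the visible content, e.g.:
//   history.pushState(null, '',
//     '/articles?page=' + pageForOffset(window.scrollY, PAGE_HEIGHT));

console.log(pageForOffset(0, 2000));    // page 1
console.log(pageForOffset(4500, 2000)); // page 3
```

Because every piece of content maps to a real, directly loadable URL, crawlers and users who land mid-feed both get a working page instead of one giant scroll.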
| JordanLowry1 -
Effects of a Home Link 301 Permanent Redirect in the Main Nav Bar
Hi Paul, Thanks a lot. A main nav menu link is what I have, and I used this code to create it, since the logo already provides a link in the nav bar too:

// Filter wp_nav_menu() to add additional links and other output
function new_nav_menu_items($items) {
    $homelink = '<li><a href="' . home_url( '/' ) . '">' . __('Home') . '</a></li>';
    $items = $homelink . $items;
    return $items;
}
add_filter( 'wp_nav_menu_items', 'new_nav_menu_items' );

I appreciate your answer
| SEOguy10 -
Traffic Dropping To Website
Hi Paul, Thanks for responding. Each menu item links to its final destination. This is the functions.php code I have used on the WordPress site, which basically adds a clickable 'Home' link so users can get back to the home page from any page on the site:

// Filter wp_nav_menu() to add additional links and other output
function new_nav_menu_items($items) {
    $homelink = '<li><a href="' . home_url( '/' ) . '">' . __('Home') . '</a></li>';
    $items = $homelink . $items;
    return $items;
}
add_filter( 'wp_nav_menu_items', 'new_nav_menu_items' );

I haven't yet set up any crawler spam and ghost referrer spam filters for the site and Analytics. I do see 2 referring websites in Analytics that are not familiar to me. Kind Regards.
| SEOguy10 -
Making a website menu + structure + hierarchies + kw research
In my opinion, you should never work only for the search engines. If I want to rank high with my site, I keep good user experience in mind. Of course I will target keywords to rank my site well, but I will add content that is useful and interesting to my targeted audience, build some natural links, and keep my site updated. For your last question about website structure, there is a good article here: https://blog.kissmetrics.com/site-structure-enhance-seo/
| rootwaysinc0 -
Recommendations for top notch US based WP developers
Hi Rosemary! Thanks for posting to Q&A! Just FYI, though, we don't allow job listings here, which is really what this is. There's a great job board at Inbound.org if you're interested. I'm going to lock this thread to further responses. Thank you for understanding.
| MattRoney0 -
Multiple Similar Product Variations - Page layout, Title and SEO best practice??
Hi Craig, Thanks for the compliment. I talked to a developer from WooThemes/WooCommerce at a fair last week, and he pointed me to https://www.woothemes.com/storefront/ It promises to be their most flexible theme yet, and your wish should definitely be possible with it. You can also contact them with your requirements, and I am sure they can help you further. Best of luck with the sunglasses :). Tymen
| Tymen0 -
Regarding rel=canonical on duplicate pages on a shopping site... some direction, please.
Thank you, Patrick and Kate!
| DavidCiti0 -
Pageless/Single Page Design and Migration Questions
Hi there, There is definitely a lot that goes into a site migration/redesign, and re-thinking the way your content is presented to the user is definitely part of it. So, before anything else, kudos for spotting that. In terms of your questions, there is a general rule for UX and IA when organizing content and hierarchy on your site: make every important page reachable by a user within three clicks. This is referred to as the "three-click rule". In other words, can a user who lands on the homepage get to the "contact" page within three clicks? How about the "how to apply" page? I tend to extend this rule to four clicks for specific verticals, and I think it would be appropriate to say that if you can get your user to any page within four clicks of the homepage on your specific site, you are doing a great job. Also, Rand did a great Whiteboard Friday on this topic that you might want to check out. Here are some other thoughts to consider in terms of SEO: Having a long homepage with parallax might look good to some, but the more important thing to ask is "will users be able to find this content efficiently and quickly on all devices?" A site migration can break an entire site's SEO very quickly, so I would do a thorough read of this wonderful guide on migrating sites and make sure you are set on that front. Hope that is helpful!
| sergeystefoglo0 -
Sidebar menu: good or bad idea?
The little box you're talking about: make a small table or div and float it to the left. When you say duplicate menu at the bottom, do you mean the main menu or the sidebar menu? You can do either. Anything you want.
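A minimal CSS sketch of that float approach (the class name is hypothetical):

```css
/* Hypothetical class: float the sidebar box left so the page
   content wraps around it */
.sidebar-box {
  float: left;
  width: 220px;        /* fixed box width */
  margin: 0 1em 1em 0; /* breathing room to the right and below */
}
```

The same rules work on a div or a table; a div is the more modern choice.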
| EGOL0 -
Migrate from HTML to Wordpress?
I understand your fears completely, because WP + themes + plugins can be a HUGE mess. Why? Because some devs don't know technical and on-page SEO. I have seen themes that put the text "Comments" inside an H1 tag while the post title was wrapped in an H3. Examples of hidden text, messy HTML and bloated HTML are countless. You can also get server overloading, slow SQL queries and similar issues; there can be performance issues, PHP issues, hosting issues. Running WP means you will get 10 CSS files and 10 JS libraries, and that is the best-case scenario; just imagine the worst. Now add the upgrade procedure, where everything can be broken (or changed!) with just one click. Ah, and my favorite: security exploits and hacking. Sounds like a perfect storm, doesn't it? Now, I know this sounds scary. That's why you should review the HTML code of the WP build before migration. Probably you will see potential for improvements, and those changes need to be patched over the original files (I'm talking about the theme or plugins). For a theme that's OK: you can make a child theme based on the original. But for plugins, you need to "fork" the original plugin and make your own custom version. Then, on each update, you should "diff" your version against the original to keep your patches and new code. This means a very strong backup solution, plus a local dev environment and extra work on each update. Also, it's the year 2016: why not look around for alternatives? I can give you a suggestion - static site generators: https://www.smashingmagazine.com/2015/11/modern-static-website-generators-next-big-thing/ https://www.smashingmagazine.com/2015/11/static-website-generators-jekyll-middleman-roots-hugo-review/ As you can see, I'm not giving a yes or no answer; I'm just giving you a few extra points to think about, putting the cards on the table. PS: This may sound a little negative because I have worked on themes and plugins in the past. One wrong choice and all SEO efforts can be ruined. Such is life...
Glossary: "fork" - the process of creating a different version of something existing, with changes that don't exist in the original. You will hear that devs fork projects too. In your case, you can apply your patches, but on the next update everything reverts to the original; that's why you need to fork. "diff" - the process of checking the difference between files/projects and extracting/showing only that difference.
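As a concrete sketch of the fork-and-diff workflow described above (the directory names and file contents are hypothetical, just to show the commands):

```shell
# Fork: start from a copy of the original plugin, then apply your changes.
mkdir -p original-plugin forked-plugin
printf '<?php echo "hello";\n' > original-plugin/plugin.php
cp original-plugin/plugin.php forked-plugin/plugin.php
printf '// my SEO patch\n' >> forked-plugin/plugin.php

# Diff: capture exactly what you changed, so you can re-apply it after
# the next upstream update wipes your edits.
# (diff exits 1 when files differ, which is expected here)
diff -ru original-plugin forked-plugin > plugin.patch || true

# plugin.patch now holds the unified diff - your patch to keep re-applying
cat plugin.patch
```

On each plugin update you would re-run the diff against the fresh upstream copy and merge your patch back in, which is exactly the extra maintenance cost mentioned above.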
| Mobilio0