In-house for my own sites and freelance SEO mainly to local businesses. I'm actually more interested in writing software so will be delving into SEO software next.
I couldn't tell from your profile, which are you? Nice to meet you Eyepaq.
Sounds like you might have to jump into the code. Here is a post from their forums regarding usage for page titles.
Can you send your parameters as POST data? I think this might be preferred if you do not want to index a URL with the parameters tacked onto the end.
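As a quick sketch (the form action and field name are hypothetical), a search form that submits its parameters as POST data, so nothing is appended to the URL, might look like:

```html
<!-- Hypothetical search form: parameters travel in the request body,
     not as a query string tacked onto the URL -->
<form action="/search" method="post">
  <input type="text" name="keywords">
  <button type="submit">Search</button>
</form>
```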
Hi Michelle,
Hopefully others will chime in on whether or not you need to worry about diluting page authority (I am not as familiar with how authority is determined and do not have an answer for you).
I do have a couple questions about how you will identify return visitors. Will you be using a cookie residing on their computer to identify them? Will they need to log in?
Perhaps a structure in which your neutral home page is at the root and you provide a different URL to serve up content for users who are logged in, accept cookies, or are otherwise identified:
www.flatworldknowledge.com - neutral home page
www.flatworldknowledge.com/students - content for returning students
www.flatworldknowledge.com/faculty - content for returning instructors
I'm not sure if you've noticed, but SEOmoz uses a similar method for PRO members. If you are logged in as a PRO member and type www.seomoz.org into the address bar and hit enter, you are sent to your PRO Dashboard page: http://www.seomoz.org/users/pro. Your site could work similarly for students and instructors. Using this type of method would address your issue with title tags and you may have an easier time producing the dynamic content from a technical standpoint as well.
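For example, if you end up identifying returning users with a cookie on an Apache server, an .htaccess rule along these lines (the cookie name and paths are hypothetical, and this assumes mod_rewrite is available) could send them straight to their landing page:

```apache
RewriteEngine On
# Hypothetical cookie "visitor_type" set when a student logs in
RewriteCond %{HTTP_COOKIE} visitor_type=student [NC]
# Redirect only requests for the home page
RewriteRule ^$ /students [R=302,L]
```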
Anyways, hope this helps!
You won't be able to delete your root domain. The same index page is resolving for two different URLs.
I think you would want to 301 redirect from www.ferringway.com/index.php to www.ferringway.com. This will keep your links in place and fix the duplicate content problem. You can read more about 301 redirecting here.
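On an Apache server, one way to sketch that redirect in your .htaccess file (assuming mod_rewrite is available) is:

```apache
RewriteEngine On
# 301 redirect /index.php to the root URL; checking THE_REQUEST
# avoids a loop if the server rewrites to index.php internally
RewriteCond %{THE_REQUEST} /index\.php [NC]
RewriteRule ^index\.php$ http://www.ferringway.com/ [R=301,L]
```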
Take a spin through Fabio Ricotta's slide-deck from MozCon 2012. There are a number of E-Commerce examples for including products on other pages to increase conversions.
Hi Colin,
Take a look at this page on the Wordpress.org site. The link should take you right to the file permissions section.
Hope this helps!
Hi Nikos, here is a link to a previous discussion on exact match domains and redirection.
Hope this helps!
I don't know the best way to handle this, but maybe this would work.
Take an example site built for attorneys that provides search functionality where users can pare down the full list of attorneys by searching with keywords such as tax, miami, divorce, etc.
Using your location based info, you can provide a separate URL for each location. The keywords or search parameters would be sent as a query string or post data to a page like the following:
Example:
Since you know which locations you are targeting, you can add the main URLs to your site map and possibly even include them as links on your main search page in a "popular searches" section.
Example:
Popular Searches
The main category would include all results for that state.
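Grounded in the structure above (the URLs themselves are hypothetical), the popular-searches section could be plain links to your location URLs:

```html
<!-- Hypothetical "Popular Searches" links on the main search page -->
<ul>
  <li><a href="/attorneys/florida">Florida Attorneys</a></li>
  <li><a href="/attorneys/florida/miami">Miami Attorneys</a></li>
</ul>
```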
This is an idea anyway, hope it helps!
Hello Gagan, please note my suggestion is to use "noindex, follow", not "noindex, nofollow".
Your links will still be followed, it's just that your paginated pages will not be indexed which should eliminate problems with duplicate content.
You might even be more interested in using rel="next" and rel="prev" as described here.
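As a sketch of what that markup looks like (URLs hypothetical), page 2 of a paginated series would carry both tags in its head:

```html
<!-- In the <head> of page 2 of the series -->
<link rel="prev" href="http://www.example.com/category/page/1/">
<link rel="next" href="http://www.example.com/category/page/3/">
```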
Hi Gagan,
I generally see pagination used when you have a large number of products that match a category, tag, or other identifying attribute. I'm not sure what others have done, but an approach I would use if I do want the category page indexed is as follows:
This will cause your primary category landing page (the first 10 results) to be indexed with your desired page title and description. Because it is a landing page for this "category", I would also include a summary about the category and other useful information for users.
Your other pagination pages will continue to have the same title and description, but they will not be indexed. This will eliminate the duplicate content issue.
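To sketch that approach: page 1 (your category landing page) gets no robots meta tag and keeps its unique title and description, while pages 2 and up carry a tag like this in their head:

```html
<!-- On page 2 and beyond only; page 1 has no robots meta tag -->
<meta name="robots" content="noindex, follow">
```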
Hopefully others will sound off with what has worked for them too.
Hope this helps!
Excellent news that you were able to get it to work.
I'd say yes, your social share counts have zeroed out because they used the old URL without the www. It's not a Google thing, though, and won't be fixed by re-indexing the site. However, my understanding is that WordPress 301 redirects links from the non-www URL to the www URL, so your link juice should still flow to the new URLs and the shared links will still resolve to the right page. The only caveat is that the actual number of likes and shares in your social share buttons has reset to 0.
In my opinion, it's a good move (you're just missing the "eye-candy" for now) and this number will rise again as people begin to share using the new URLs.
What Igor has said is correct.
I do want to point out another reason for ALT tags and that is accessibility. The ALT text of an image is used by screen readers and other software to provide valuable information about images to users. In your case, the valuable information is that the button should be clicked to view inventory for an event.
If the ALT text weren't populated, there would be nothing to indicate to the user what will happen when the link or button is clicked. I think Igor's suggestion to populate your ALT tags with something like, "Browse all 'event name' tickets", is appropriate.
Edit: I just saw you have already checked with FileZilla. To fix this on other sites, I've created a new .htaccess file, pasted in what WordPress provides as an example, and then uploaded the new .htaccess file.
Hi Gary, I don't. But you might be able to find out on the Yoast site.
Glad to be of help :).
Hi Scott, yes, this should be correct. After updating this value, you may need to log back in to continue using the admin area. This is because you are likely logged in at the old domain, simply onestopmuscle.co.uk, rather than the new www.onestopmuscle.co.uk, and with the new redirect in place, you have to log in again. It's kind of strange behavior, I know :).
In the WordPress admin area, under General Settings, there is an input for WordPress Address (URL). Here, you can set the URL you would like to use as your base URL (e.g. http://www.example.com/). If you use www. here, your pages should use the www. version.
Hope this helps!
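If the setting won't stick, another option (a sketch, with a hypothetical domain) is to hard-code the URLs in wp-config.php using WordPress's built-in constants:

```php
// In wp-config.php — overrides the General Settings values
define('WP_HOME',    'http://www.example.com');
define('WP_SITEURL', 'http://www.example.com');
```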
Hi Gary! I would suggest reading more about configuring for robots here, here, and maybe here (note: this last link uses an older version of the plugin, but provides good information and a lot can be learned by reading the comments too).
Otherwise, if you are using the most recent version of the plugin, you can click on "Titles & Metas" and then on the "Taxonomies" tab. On this tab, if you check the box under the heading for "Tags" for "noindex, follow", it will set all of your Tag archives to noindex, follow.
This will get rid of duplicate content issues for tags, but realize that your tag archives will then not be indexed. Regardless of what you decide to do, I suggest reading more at the links above.
I hope this helps!
For a large site, a lot of errors will be duplicates, or at least similar. Identify and group them, then fix the squeaky wheels (the errors with the most occurrences).
Have a ton of duplicate content?
99% chance it can be fixed with a good 301 redirect solution.
Inappropriate or non-existent keyword usage on-page?
Follow the easy fix suggestions from your campaign (learn more about on-page reports here). Often, your pages are dynamically created and by working with a good developer you can fix issues with missing page titles and descriptions easily and often with one-punch knockout success.
Overall, there is not going to be a quick and easy way to fix all of your issues, but you should be able to identify the areas where you might see the most success in the least amount of time.
Since your site uses Joomla as its CMS, this extension might be useful. I suggest reading through the comments/reviews to determine whether it will work for you.