I wasn't implying that he should make a network of sites; I meant the links he gets should be better. My bad.
Posts made by AndreVanKets
-
RE: Cross-linking domains dominate SERP?
Just to add to that: these websites have shown you their strategy. Instead of admiring them, replicate what they have done, but do it even better.
Original content, like EGOL suggests, plus even more relevant and stronger links, and no doubt you'll be a strong competitor.
Greg
-
RE: Duplicate Content
WordPress does this when you use tags.
Essentially the tag pages display the exact same content as the original URL, so the pages are identical but the URLs are different.
Two options that I can think of:
1.) Remove the tags, strip the category segment from the URL, and stop using them in future. This will require redirects from the duplicate URLs to the main article (this will take planning, a lot of time, and is quite complicated).
2.) If you want the tags and categories for user experience, install the Yoast SEO plugin, which allows you to insert a canonical URL on the duplicate category pages. This tells Google where the original page can be found. Tags are only there for user experience, so you can set these to nofollow and noindex.
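For reference, a minimal sketch of what this looks like in the `<head>` of a duplicate tag page (the article URL here is a placeholder, not from the thread):

```html
<!-- Points Google at the original article instead of the duplicate tag page -->
<link rel="canonical" href="http://www.example.com/original-article/" />
<!-- Keeps the tag page itself out of the index and stops link equity flowing from it -->
<meta name="robots" content="noindex,nofollow">
```

Yoast generates these tags for you once it's configured; you shouldn't need to hand-edit your theme templates.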
Greg
-
RE: How to rank for difficult terms
You could optimise the website for the location in the UK where the business is situated. This will give it some priority over other websites when people search from the same location as the business.
Do some research on Local SEO and make the most of it for your client.
Greg
-
RE: Redirection - Seo trick?
People buy expired domains and redirect them to their website to pass on the link juice and PR.
It's a grey-hat technique which I don't have any experience with myself, but I do know that this is what people do.
I found a PR7 website with only one link to it, and I assume this was accomplished by redirecting authoritative domains to the new website.
My 2c
Greg
-
RE: Satellite Sites ?
Yes, they can. It's just more expensive to host them on different servers.
-
RE: Web 2.0 seo
Web 2.0 sites are the lowest-value links you can get, IMO, and a waste of time.
Remember:
site.wordpress.com and website.wordpress.com are both on the same website, that being WordPress.
No matter how many links you get on these blog sites, they still only count as one linking root domain, which doesn't help SEO very much.
Don't waste your time spinning articles to submit to Web 2.0 sites (SEnuke X blasts etc.). Rather spend your time creating unique articles and publishing them on different REAL websites. This is the difference between "building" links and "earning" links (which do you think Google prefers?).
Greg
-
RE: Will our PA be retained after URL updates?
Sorting out the canonicalisation was a good move, and URLs with lower-case characters are generally best practice, but there is no negative SEO impact from capital letters in a URL. (Let's hope the PA wasn't too high, as the new URLs are essentially new pages.)
The link juice will flow via the redirect, but I don't think Open Site Explorer will follow the redirect as a link that contributes to PA.
Perhaps ask your SEO to change the URLs on links that have been built to these pages. This will definitely bring your PA for these pages back up.
Greg
-
RE: Changing URL's for a website redesign
Will the URL be the only change on the page?
If so, in the case of redirecting www.website.com/page1.html to www.website.com/page1/, then a simple 301 redirect from URL A to URL B will be fine.
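As a rough sketch, that redirect can be a single line in the site's .htaccess (assuming Apache with mod_alias; the URLs are the ones from the example above):

```apache
# Permanent (301) redirect from the old .html URL to the new directory-style URL
Redirect 301 /page1.html http://www.website.com/page1/
```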
You may see fluctuations in rankings for a while, but it should settle down close to where you were previously.
If you are changing keywords in the URL, then you might see more fluctuation in rankings.
Also, spend some time contacting webmasters that are linking to your old URLs and ask them to point to the new ones. Google passes link juice via redirects, but it would be worth your while to get as many links as possible pointing directly to your pages without having to redirect.
Greg
-
RE: Robots.txt issue - site resubmission needed?
I agree with Michael.
I have also seen a WordPress site that had blocked robots from the entire site for a week.
After allowing the robots back in, we saw the rankings improve within a few days.
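For what it's worth, that kind of total block usually looks like this in robots.txt:

```
# This blocks ALL crawlers from the ENTIRE site
User-agent: *
Disallow: /
```

To let the robots back in, change `Disallow: /` to an empty `Disallow:` (or remove the rule entirely).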
Don't stress. Just resubmit the sitemap, or create a new one with the affected URLs.
Greg
-
RE: De-indexed Link Directory
I would email each webmaster and save the correspondence so that you can at least prove to Google that you tried to remove the links when you send your next reconsideration request.
We weren't hit by any penalties, but after running the detox tool we found a few sites that had been de-indexed, and we got a decent response rate when asking for the links to be removed.
If we ever get hit by a dodgy-link penalty, we can at least provide some proof that we tried to clean up our link profile.
Greg
-
RE: Thinking about deindexing 200,000 pages
If the pages are there for user experience only, and you don't expect any of them to rank, I would block Googlebot from crawling the pages/categories you want to remove using robots.txt, as well as setting the headers on all these pages to noindex (for good measure).
Once you have set this up, you can request the URLs to be removed in your G-WMT account.
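As a sketch, assuming the sections sat under paths like these (the directory names are hypothetical, not from the thread):

```
# robots.txt — stop crawlers fetching the sections you want out of the index
User-agent: *
Disallow: /booking/
Disallow: /media/
```

plus a `<meta name="robots" content="noindex">` tag in the head of each of those pages. One caveat: once robots.txt blocks the crawl, Google may never see the noindex tag, which is why the removal request in G-WMT does the heavy lifting.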
We had loads of booking pages and media directories indexed, and this is what we did to get them out of the index. I'm not sure how much it improves rankings, but it definitely tells Google which pages are most important and prevents Google's crawl resources from being spent on pages that aren't.
Hope that helps?
Greg
-
RE: Do shady backlinks actually damage ranking?
It's all speculation at the moment, so the fear-mongering is rampant.
In short, yes, bad backlinks can have a negative effect on your site, but it depends on how well established your site is. Negative SEO is real, but depending on the severity of the dodgy links, Google may just discredit them and ask you to get rid of them rather than slapping you with a penalty.
When you say you are afraid of "going out there", what is it that you are afraid of? Just don't be manipulative: get links on relevant websites and you have nothing to worry about. Forget about trying to manipulate rankings and build relationships with other webmasters instead. After all, one great link is better than 50 suspect, low-value links that take hours to "build".
-
RE: Exact Match Domain + shorter permalink vs. longer permalink?
www.acupunctureintakeform.com/template/ is a much better idea.
There will be no SEO benefit to repeating the keyword in the URL.
As for the future pages, you could always do the following
www.acupunctureintakeform.com/template/
www.acupunctureintakeform.com/template-with-diagrams/
www.acupunctureintakeform.com/template-with-pictures/
etc
Hope that helps.
Greg
-
RE: Can someone tell me what ∞% trending upward in my keyword report means?
The change % represents the difference in visibility for the keyword you are tracking in the report. An ∞% change usually means the keyword's previous value was zero, so any increase shows as an infinite percentage.
-
RE: DA/PA against PR
I have come across this many times.
Have a look at this one.
www.becitywise.com/ PR7 DA6!
I tend to trust Open Site Explorer much more when comparing the two.
Greg
-
RE: 301 redirect www.brandname.com to www.brandname-keyword.com
How old is the existing www.xxxx.com website? How many pages are indexed?
If it's a new site with only a few pages indexed, and you are thinking of using www.xxx-toys.com instead, then it will be a fairly hassle-free process.
When you're dealing with many pages, it can be a bit more time-consuming and complicated.
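If you do go ahead, here's a sketch of the domain-level redirect on the old site (assuming Apache with mod_rewrite; the domains are the placeholders from the question):

```apache
# .htaccess on the old domain — 301 every URL to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?xxxx\.com$ [NC]
RewriteRule ^(.*)$ http://www.xxx-toys.com/$1 [R=301,L]
```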
Greg
-
RE: Similar sites on same IP address
Having each site on a unique IP isn't necessary.
The only real benefit of having your sites on separate Class C IPs is to break the relationship between them. In your case, sorting out the duplicate content issue should be enough to resolve the penalty.
The only downfall, IMO, would be the low link juice value from linking between all three sites.
Greg.
-
RE: Caps in URL creating duplicate content
www.url.com/abc and www.url.com/ABC are two completely different pages according to Google.
I would redirect any and all pages with capitals to the corresponding lower-case URLs.
Don't worry about the link juice, as it will pass over via the redirect. It will also be much better than having two identical pages competing with each other (according to Google).
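A sketch of a blanket lowercase redirect on Apache (this uses mod_rewrite's `int:tolower` map, which has to be declared in the server or virtual-host config rather than in .htaccess):

```apache
# httpd.conf / vhost config — 301 any URL containing capitals to its lowercase twin
RewriteEngine On
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ ${lc:$1} [R=301,L]
```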
Greg
-
RE: Duplicate Page Content Report
Are all 2,000 pages 404s?
Do they all have unique URLs?
If all 2,000 are 404s, then Mozbot would pick these up as duplicate content as well as 404s; but if it's only reporting duplicates and not all of them are 404s, then other pages must be duplicates too.
1.) Confirm all the duplicate pages are 404s.
2.) Do a scan using Xenu Link Sleuth and see which pages are linking to the 404s.