Monday, April 30, 2012

Negative SEO: A New Emerging Trend


Negative SEO has become the new talk of the town. Will companies wage a bitter war against competitors for the coveted no. 1 position in Google? Has this become an acid test for Google?

So what is Negative SEO?

It means implementing techniques that Google classifies as "spam" and "over-optimization" against competitors, to push a competitor's website down in the rankings. Now that Google has introduced penalties for unnatural link building, one can see how this could be used against competitors.

Dan Thies' case is a good example. When Dan openly praised Google for going hard against blog networks and spam links, some people who were upset about the new update made him a symbolic target. Someone pointed 1 million Scrapebox links at his site. The result: Dan received warnings and lost some rankings.

As the cliché goes, all is fair in love and war. SEOs who believe in results rather than ethics can use negative SEO as a brutal weapon to overthrow competitors.

How can negative SEO be used against a competitor?
  • Building a large number of anchor-text links in a short time.
  • Building links from irrelevant websites.
  • Building too many nofollow links.
  • Copying the competitor's content and publishing the same on an authority website.

Not all of the above may work. This is just a wild guess and my personal opinion; I have no evidence apart from Dan Thies' case.


Is SEO Community divided?

For the first time, the SEO community looks divided. It's my SEO versus your SEO, my SEO company against your SEO company. Until now, people followed unethical practices to improve their own rankings. Now people can directly damage other websites. Yes, websites were affected before too, but only by a position or two. Now the magnitude of the consequences is unthinkable: a penalty may push you far lower than you expect.

Did Google make a wrong move?

Without an iota of doubt, Google's action is justified. However, Google has little to offer the small-business victims. How can Google judge whether someone who submits a website for reconsideration is a victim? Even Google finds this messy. Will such cases require human intervention?

Google is surely already working to close this loophole. Another update may come. It takes time. Let's wait.

Call us for FREE SEO audit 080-42111388 or email us info@yourseoservices.com

Thursday, April 26, 2012

Google Announces Web Spam Algorithm Update


Google has sent shivers down the spines of search engine optimizers who follow spam techniques. From day one, Google has tried to protect its SERPs from manipulation. The war is not against SEO but against spammers who deliberately use techniques that violate Google's guidelines.

Google's results improve year by year. This time too, like the Panda update, the algorithm affects tons of websites: about 3% of search queries show changed results. The penalization of web spam has been aggressive since 2011, pushing keyword-stuffed and unnaturally linked websites to a new low.

Anchor-text links play a critical role in a website's ranking, and directories and bookmarking sites make them easily available. Keywords play a key role in a website being found; the title tag, footer links, meta description, and below-the-fold area of a website are the hot spots for placing them.

It is no secret that a website needs links and keyword-rich content to rank at the top. White-hat SEOs dedicate themselves to creating content, often using a blog as the platform, which helps both readers and Google. Black-hat SEOs focus more on satisfying Google's technical signals and forget what people want. They find Google's loopholes and try to exploit them. This directly degraded Google's results.

Google has named this update the webspam algorithm update. It is strictly going after:

Keyword Stuffing


Heavy keyword usage starts with meaningless title tags, descriptions, and headings, and ends with footer links. A visitor who lands on such a website from Google, looking for information, finds only keywords all over the page. Some are bold, italic, or underlined. Some pages repeat the first word of the key phrase, or the first two words of a long phrase, or a single key phrase and nothing else: total garbage. Google is coming for these websites, or has already slapped them with this update.
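The stuffing described above can be quantified. As a rough sketch (this illustrates the general idea of keyword density, not Google's actual detection method, and the sample page text and phrase are made up), a short script can measure what fraction of a page's words a single key phrase accounts for:

```python
import re

def keyword_density(text, phrase):
    """Fraction of the words in `text` accounted for by repetitions of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count occurrences of the phrase in the word stream.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

page = "cheap shoes buy cheap shoes online cheap shoes cheap shoes store"
print(round(keyword_density(page, "cheap shoes"), 2))  # 0.73
```

On a normal page a single phrase stays well under a few percent of the words; in the stuffed sample above it dominates the page.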

Duplicate Content


A website can get involved in content duplication in two ways. One: stealing and publishing, or copying and pasting, the same content across two or more websites, which is totally unacceptable to Google. Two: using the same content on all the inner pages, changing only the keywords in the headings, title, and meta tags, with minute changes in the body. The latter hurts really hard, as it is deliberate content duplication done exclusively for Google.
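For illustration, the second kind of duplication (same body with only the location or keywords swapped) is easy to detect with a shingle comparison. This is a minimal sketch, not Google's actual algorithm; the sample sentences are made up:

```python
def shingles(text, k=3):
    """Set of k-word shingles from the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two pages' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "best preschool in bangalore with montessori curriculum"
page_b = "best preschool in mysore with montessori curriculum"
print(round(jaccard(page_a, page_b), 2))  # 0.25
```

Even with only one word changed, shared shingles reveal that the two pages are near-duplicates; real pages with longer common bodies would score far higher.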

Link Spam


Using backlink automation services, Stone Age link building like blog commenting and signature links with anchor text, or bots that automatically find blogs and post the same comment: these all come under link spam.

From the word go, Google said in its blog that it loves SEO and that this is not an "over-optimization" penalty, as there is no such thing. It is against web spam.

Call us for FREE SEO audit 080-42111388 or email us info@yourseoservices.com

Wednesday, April 25, 2012

The Evolution of Panda and its After Effects


When Panda was introduced in the United States on February 24th, 2011, the majority of article directories and portal websites got hit. Leave the list of winners and losers of the Panda update aside for a moment and look back at the SEO strategies these websites followed in the pre-Panda era.

Pre Panda Era

Demand Media-style content sites and article directories were among the websites where content was updated one piece after another. More and more websites were being founded across the internet. With so many websites live, competition was intense among site owners over AdSense share, and furious among link builders over obtaining anchor-text backlinks from resource boxes.
At a time when user-generated content production was high, Google introduced a new web indexing system, Caffeine, which started indexing new pages at an uncanny speed. But from the time of the Caffeine update, surfacing relevant, quality content from blogs, news sites, forums, and article directories became a challenge.

Post Panda Era

Today, most of these skyscraper websites seem not to be making a profit. Perhaps only sites like eHow, TechRepublic, and Indiatimes have been able to keep their heads above the rest. So what's wrong with the Panda-hit websites, or even authoritative sites like Business.com? Why is Panda not affecting a website like TechRepublic? The SEOs speak in one voice on this: it is the lack of quality content, poor user experience, and duplicate content.
At the other end of the bridge lie EzineArticles, Business.com, and Mahalo.com. For EzineArticles, duplicate content became a bane. The trouble with websites like Business.com has more to do with big above-the-fold ads and poor company listings.
The owners of these kinds of websites have taken stringent action against the duplicate-content malaise and against the submitters. The recent changes to their homepages assure me that their over-reliance on banner ads and thin content will be cleared up over time.

Call us for FREE SEO audit 080-42111388 or email us info@yourseoservices.com

Tuesday, April 24, 2012

How to Do Local SEO for a Global Website without a Physical Mailing Address


One of the challenges in local search engine optimization is ranking a website that has no local physical mailing address. Look carefully at the Google SERPs for your local keywords: 6 to 7 of the 10 first-page positions are taken by Google Maps listings. The benefit of submitting to Google Maps and owning a local TLD domain is undeniable.

However, with a well-optimized website and high-quality links, a global website can rank in the no. 1 position in Google. So here is a roadmap to improve rankings and compete with local businesses.

Local Landing Page

Dedicate a landing page to each location. For example, segregate the HTML file names by geography: abc.com/preschool/Bangalore/Karnataka.html
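A small helper can keep these location paths consistent across hundreds of pages. This is a hypothetical sketch; the function name and the /product/City/State.html pattern simply mirror the example above:

```python
def landing_page_path(product, city, state):
    """Build a location-specific landing page path in the pattern
    /<product>/<City>/<State>.html (hypothetical helper, per the example above)."""
    return "/{}/{}/{}.html".format(product.lower(), city.title(), state.title())

print(landing_page_path("preschool", "bangalore", "karnataka"))
# /preschool/Bangalore/Karnataka.html
```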

Meta Tag Optimization

If the file name matters, the title tag is an even stronger indicator of what the page is about. Use the title tag wisely: location + product, or product + location: "Bangalore Preschools" or "Preschools in Bangalore".
Overusing the location words quickly amounts to keyword stuffing. The header tags should strengthen the products rather than the location, for example "Montessori Schools in Bangalore", "Reggio Emilia Schools in Bangalore".

Unique Local Content

If you add hundreds of landing pages to the website, changing only the location or physical mailing address, you are creating a Panda target. Don't build out a website just for the sake of local traffic by duplicating content. You may drive traffic for a month or two, but then you will drop drastically. Encourage reviews and user-generated content, and write unique content about each city or town.

Local Links

Building local links takes more time than usual. Contact local bloggers to contribute content to your blog, and at the same time be a guest blogger on theirs. Get links from local business directories and portal sites. Target the audience on Facebook and Twitter by demographic.

Local search optimization for a global business is time-consuming. But it ain't impossible.

Call us for FREE SEO audit 080-42111388 or email us info@yourseoservices.com

Monday, April 23, 2012

Google Sending Warnings over Unnatural Link Building through Google Webmaster Tools


Harsh penalties on paid directories


Google's recent focus on penalizing paid links is both understandable and justified. Sponsored link building has been rampant over the years and is one of the main reasons for Google's latest action. Finding relevant links for websites is time-consuming and has been an especially challenging task. Besides paid links, webmasters rely heavily on reciprocal links. Paid directories, though still at large, have been slowly declining.

Some SEO companies are generally reluctant to spend the time that link-baiting programs require. Dependent as they are on quick results and deadlines, they prefer bulk paid listings.

Google showed blog networks the exit


The Google spam team has been slowly tackling the issue, strengthening its technology to improve the quality of results. First it went hard against sites publishing duplicate content. Then it started de-indexing blog network web pages. These specific measures aim at penalizing the blog networks and encouraging quality content providers, the bloggers, to provide more quality content to visitors.

The new step is not the universal remedy it is claimed to be. On the contrary, it is a reversal of a signal past Google algorithms relied on for rankings: anchor-text links. Typically, SEO agencies signed up with blog networks and article marketing companies to get anchor-text links. Many article marketers, running many article directory sites and networks of blogs, published the same content, which in return offered anchor-text links. Caffeine has come across millions of low-quality pages in the recent past that are both duplicated and worthless to visitors. Such a steep increase in low-quality pages and unnatural links is something that had to be stopped.

The message is clear: write quality content and treat backlinks as thumbs-up votes for the service, product, or information published on your website. Buying votes from others is totally undemocratic.

Call us for FREE SEO audit +91 8494983557 or email us info at yourseoservices.com

Wednesday, April 18, 2012

A Small Tale of the Latest Google Algorithm – the SEO Over-Optimization Penalty



The Google algorithm ghost is back to haunt search engine optimizers. This time it is SEO over-optimization, which Matt Cutts raised with the SEO professionals attending SMX. Matt Cutts, the head of Google's webspam team, flagged webmasters who take the optimization process to too great an extent and achieve rankings with less quality content. In the coming months, Google may be able to find and punish "over-optimized" websites. This is not the first time Google has gone after search engine optimizers with penalties. For the past 12 months, article directories and scraper websites have been forced to take stringent action against poor-quality content creators. The concern was that a small percentage of optimized websites could take control of Google's first page even without providing relevant content, while websites offering great content, but unaware of SEO, miss huge traffic from search engines.

Google then introduced Panda, an update targeting duplicate content; it reduced how often Googlebot visited those web pages, penalized them, and their traffic went down drastically.

Let's get this straight. If the concern is over websites that do not follow SEO guidelines but produce great content, it cannot be fixed solely by a set of off-page or duplicate-content algorithms. The "over-optimized" websites in question may have cheap links deliberately built by competitors, or a link from a website that was once a quality directory but has since turned into a junkyard. Worse, if they so wish, competitors can try to malign, in one way or another, websites that are not strong in advanced optimization.

Faced with such risks, Google has two options. One, it can take only content and design into consideration when ranking websites. But that would be a knee-jerk reaction and would surely introduce more subjectivity. The second, more pragmatic option is to develop an online platform where webmasters can instantly claim ownership of their website content for free. This could be done through Google Webmaster Tools and Google+ profiles.


Call us for FREE SEO audit +91 8494983557 or email us info at yourseoservices.com

A Eulogy for ORM – How to Push a Negative Listing Down in Google


My Experience of Online Reputation Management

Three months ago, a mail reached my inbox: a furious complaint from a client because we had failed to push the negative comment down by even one position. I devised a new plan. Last week, things worked out. I immediately emailed the client with pride: "We successfully managed to push the negative page to the 2nd page", blah blah blah, with regards.

High-authority links and fresh content are central to the well-being of ORM and its ability to push a negative link down. If the backlinks are poor, the website will not attract Google. And when link builders who should be producing unique content and engaging on social media divert their attention and dedication toward getting bulk links, they end up with junk directories and bookmarking sites. Additionally, without appropriate content, there is little guarantee of ORM results or of positive listings popping up.

Trying to promote just blogs, or engaging on every social networking site, or opening accounts on all the Web 2.0 sites and publishing the same content, rarely obtains first-page rankings in Google; executing these activities separately turned out to be a disastrous strategy with no results whatsoever.
There is no scientific evidence supporting this perception, but based on a case study of one of my clients and past experience, the following approach to ORM may work for you.

Maintain variety in the online media and platforms you target. What we did with this client was to focus on only one blog platform. In this project we chose WordPress and frequently added fresh content there, whereas other blog platforms like Blogspot and TypePad were used to build links to the microsites and other important profiles we intended to push to the top page.

This strategy, surprisingly, produced significant improvements in Google. As a general pattern, the microsites achieved positions first, although quite a few of them were stuck on the third and fourth pages for some weeks, and the Flickr and Facebook profiles took some months to come to the top. There were a few surprises: Twitter ranked comparatively low, while the Google Sites page got a great ranking and has been absolutely unmoved until now.

In many companies, content creation is too low to support the link builders; the fact is, a content writer is needed. For the first three months, the content on many of our websites was common. We then published unique content on the indexed profiles. After the profiles got cached by Google, the results started to fluctuate.

A number of factors go into pushing a negative listing down. In many instances, content updating is the main factor. In certain instances, quality links are needed in addition to the content. On Flickr, simple commenting (interacting with 5 new Flickr users per day) and photo sharing (one upload per day) are sufficient to stay in a top place.

This project shows a range of realities for online reputation management. Choose 10 distinct Web 2.0 sites to rank on the first page. They may be a blog, a video sharing site, a photo sharing site, a microsite, Wikipedia, a social networking site, or a microblogging site. But do not promote overlapping Web 2.0 sites, like Blogspot and WordPress, Twitter and Jaiku, or LinkedIn and Nyamz. Choose 10 unique Web 2.0 sites and use the rest to build links for those 10.

Call us for FREE SEO audit +91 8494983557 or email us info at yourseoservices.com