Among SEO professionals, there isn’t always consensus on precisely which site factors contribute to or detract from rankings on Google, or to what degree, because the factors vary by industry. There are, indeed, a number of contentious issues: markup and content quality, use of title tags, site organization, and even arguments that Google Analytics data factors into site rankings. Not likely (yet), but certainly up for debate among SEO professionals.
Still, there are some Google ranking factors that most professionals agree affect site positioning on Google SERPs. These are opinions, though; find out for yourself how they apply to the projects you’re working on.
Recommended Steps to Improve Google Ranking
1. Use keywords in HTML title tags. Probably the most significant factor for a site regardless of the competitive landscape, the title tag must be consistent with the content of the page for best results. The more keywords you cram into a title, the less effective this factor becomes, so be judicious.
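To make “judicious” concrete, a small script can flag the usual title-tag problems before a page ships. This is a rough sketch; the 65-character limit and the two-keyword ceiling are rules of thumb, not published Google limits, and the example titles are invented:

```python
def title_check(title, target_keywords, max_length=65):
    """Flag common title-tag problems. Thresholds are rules of thumb,
    not published Google limits."""
    problems = []
    # Which of our target keywords actually appear in the title?
    hits = [kw for kw in target_keywords if kw.lower() in title.lower()]
    if len(title) > max_length:
        problems.append("title may be truncated in SERPs")
    if not hits:
        problems.append("no target keyword in title")
    elif len(hits) > 2:
        problems.append("possible keyword stuffing")
    return problems

title_check("Hand-Tied Fly Fishing Lures | Acme Outfitters", ["fly fishing lures"])
# → []
```

A clean result simply means the title clears these mechanical checks; whether it reads well to a human is still your call.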
2. Create quality anchor text for inbound links. At one time, according to some SEO professionals, quality anchor text was an essential component of a well-ranked site. After all, this is the text the user opted to see by clicking a link on another site. Most SEOs still contend that quality anchor text is a highly significant, positive ranking factor – if not for spiders, then for the visitors clicking in. Obviously, the text should be relevant to the destination page for best results; that’s where your on-page optimization comes into play.
3. Increase link popularity. Link popularity takes into account the number of inbound links present. Link authority has less relevance, though it is still a factor depending on the competitive landscape. Link popularity is based on a global count of links from all sites. However, quality links are still critical to creating site authority; authority means ranking for more phrases than you intentionally target.
4. Hang in there. The age of a site is an important positive weighting factor according to many SEO professionals, and it’s a reasonable assumption: failed sites are dropped as soon as the hosting subscription ends. If a site has been around for 10 years, the owners must be doing something right, especially if link popularity has been steadily developed over the years. Unfortunately for site owners, there’s no way to speed up the aging process – except hanging in there.
5. Increase the popularity of internal links. These links direct visitors to helpful, related content. They’re important in providing visitors with a positive on-site experience. Search engines view on-site link popularity as a sign that visitors like what they see and want to learn more.
6. Build deep links. Deep links are inbound links that point to specific interior pages rather than the home page, and their relevance to the topic of the target page matters to a site’s Google ranking. However, please note point 3: the sheer number of inbound links is a factor as well. Quality deep links carry more weight and add credibility to a site.
7. Connect with sites selling to the same demographic. Create links with sites within your topical community. This helps visitors further their searches – something Google likes very much.
8. Keep old links. Google looks for web stability. The older the link, the more trust it carries. It indicates a lasting relationship with the linking site’s owner, who recognizes the value of sending visitors off-site. Google watchers suggest a three- to four-month window for spiders to determine that a link is well established, long term, and valuable to visitors of both sites.
9. Use keywords in body text. Make sure that keywords receive prominent display in headlines, headers, and sub-heads. It’s important that the keywords used in a page’s HTML text match the keywords used in the site’s meta data and title tags.
Practices to Avoid
1. Don’t use session IDs in URLs. It sounds like a good idea on the surface, an easy way to track customer information, but here’s the problem: each time a spider crawls the site, a new URL with a session ID is created. The spider now has two, three, or more URLs all showing duplicate content. Go back to Go; do not collect $200. Don’t confuse this with pages that have a couple of GET variables in them – avoid those when you can – but above all, keep session IDs out of your page URLs.
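A minimal sketch of the fix, normalizing URLs by dropping session parameters before they go into sitemaps or internal links. The parameter names below are common examples, not an exhaustive list; in practice a rel="canonical" tag on the page serves the same purpose:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Common session-parameter names; examples only, adjust for your platform.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def strip_session_ids(url):
    """Return `url` with session-ID query parameters removed, so every
    crawl of the same page resolves to a single canonical address."""
    parts = urlsplit(url)
    # Keep only the query parameters that aren't session identifiers.
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

strip_session_ids("https://example.com/catalog?page=2&PHPSESSID=a1b2c3")
# → 'https://example.com/catalog?page=2'
```

Legitimate GET variables (like `page=2` above) survive; only the session noise is removed.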
2. Choose a reputable web host. The most potent negative ranking factor is server accessibility. If your server, located in Timbuktu, is inaccessible to spiders, it’s inaccessible to visitors. Down time soon becomes down and out time.
3. Avoid duplicate content. Googlebots employ filters to detect duplicate content. If you opt to post syndicated articles, you’re providing a service to visitors; however, a bot will recognize that content (it’s already appeared on 400 other sites) and you’ll see your rankings and traffic drop.
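Google’s actual duplicate filter isn’t public, but you can approximate your own pre-flight check with Jaccard similarity over word shingles, a standard near-duplicate technique. A sketch, with an illustrative threshold:

```python
def shingles(text, k=5):
    """Return the set of k-word shingles in `text`."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def near_duplicate(a, b, k=5, threshold=0.5):
    """Jaccard similarity of the two texts' shingle sets.
    The 0.5 threshold is illustrative, not Google's."""
    sa, sb = shingles(a, k), shingles(b, k)
    union = sa | sb
    score = len(sa & sb) / len(union) if union else 1.0
    return score, score >= threshold
```

Running this over a syndicated article and your own copy of it returns a similarity of 1.0; a rewrite in your own words scores much lower.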
4. Jettison low-quality links. Google assesses the character of your site by the company you keep, so keep good company by unlinking from (1) link farms, (2) sites with absolutely no quality content, and (3) otherwise low-quality sites, e.g., FFA (free-for-all) link sites.
5. Avoid any kind of link deception. Googlebots aren’t smart, but they can detect some paid links and a variety of link schemes, including auto-generated links. If a Googlebot suspects link fraud, your site may be penalized and sent to the basement, or banned altogether.
6. Avoid requiring a log-in before visitors and bots can access “the good stuff.” A bot can’t get past a log-in form, so quality content hidden behind one never gets indexed. Even though users with the Google toolbar may unknowingly suggest new URLs for crawling as they surf, publishing open teasers for the content you’re monetizing by subscription will do far more for your SEO.
7. Avoid using frames. Horizontal and vertical framesets are commonly used by designers to present more than one page of a site on screen at the same time. However, frames are also bot traps: spiders can get in but they can’t get out, making it impossible for them to index the site – at all! If a frame-like layout is absolutely necessary, ask your developer about using iframes instead.
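If you’re not sure whether an older template still uses frames, a few lines with Python’s standard html.parser can audit a page before a crawler trips over it. A sketch, not a full auditing tool; the markup below is a made-up example:

```python
from html.parser import HTMLParser

class FrameFinder(HTMLParser):
    """Collects frame-related tags so a page can be audited before crawlers see it."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        # Any of these tags means the page relies on a frameset layout.
        if tag in ("frameset", "frame", "noframes"):
            self.found.append(tag)

finder = FrameFinder()
finder.feed('<html><frameset cols="20%,80%"><frame src="nav.html">'
            '<frame src="main.html"></frameset></html>')
finder.found
# → ['frameset', 'frame', 'frame']
```

An empty `found` list on every template is what you want; anything else is worth a conversation with your developer.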
8. Avoid duplicate title/meta tags. Title and meta tags are a valuable resource for site owners to expand access points to a site; unique title tags help ensure that more pages are indexed and listed in Google’s SERPs as distinct links. All good. Unfortunately, repeating the same title tag on page after page is redundant and a waste of the bot’s time. Tag your pages uniquely and judiciously.
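One way to catch the problem across a site is to collect your page titles and group them; any title shared by more than one URL deserves a look. A minimal sketch in which the crawl itself is assumed and `pages` is just a url-to-title mapping you supply:

```python
from collections import defaultdict

def duplicate_titles(pages):
    """Given {url: title}, return the titles shared by more than one URL."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        # Normalize so 'Acme' and ' acme ' count as the same title.
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

duplicate_titles({
    "/": "Acme Outfitters",
    "/lures": "Acme Outfitters",       # duplicate – should get its own title
    "/about": "About Acme Outfitters",
})
# → {'acme outfitters': ['/', '/lures']}
```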
9. Do not keyword stuff. Even though search engines no longer give much weight to keyword meta tags, keyword stuffing continues. Select 20 to 30 keywords – top-tier and long-tail – and focus on them. Keep keyword density in body text to no more than 3%; the old 5% rule produced on-site gibberish. Obviously, these figures vary by competitive landscape.
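The 3% figure is a rule of thumb, but it’s easy to measure rather than guess. A quick density calculator; the regex tokenization is simplistic by design and the sample sentence is invented:

```python
import re

def keyword_density(text, keyword):
    """Percent of the words in `text` accounted for by occurrences of
    `keyword`, which may be a multi-word phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    if not words or not phrase:
        return 0.0
    # Count non-overlapping-in-spirit matches of the phrase at each position.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

keyword_density("We sell fly fishing gear for fly fishing beginners online", "fly fishing")
# → 40.0
```

At 40%, that sample sentence is exactly the sort of gibberish the 3% guideline exists to prevent.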
10. Do not let quality slip – even for a day. Spiders crawl sites with increasing frequency and sophistication, and index updates follow soon after changes to a site are implemented. During periods of construction, keep spiders out of staging areas that have yet to be completed, or block them with robots.txt. Exposed works-in-progress may cost you points in the ranking sweepstakes.
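Whether a staging block actually works is easy to verify with the standard library’s urllib.robotparser before you push it live. The /staging/ path and example.com URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that keeps all crawlers out of an unfinished
# staging area while leaving the live site open.
rules = [
    "User-agent: *",
    "Disallow: /staging/",
]

rp = RobotFileParser()
rp.parse(rules)

rp.can_fetch("Googlebot", "https://example.com/staging/draft.html")  # → False
rp.can_fetch("Googlebot", "https://example.com/blog/post.html")      # → True
```

Remember that robots.txt is a request, not access control; anything truly sensitive belongs behind authentication, not just a Disallow line.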
Google controls 46% of all searches. Doesn’t it make sense to give this search engine exactly what it wants and delete what it doesn’t want?