Improving Search Engine Rankings
- Search Engine Theory
- Designing for Search Effectiveness
- Content Recommendations
- Technical Recommendations
- Site Linking & Popularity
- Other Factors
How can this subject be eight times as interesting as the most searched-for woman on the planet? And at least three times as interesting as the most notorious name-brand drug in history? Simple... it isn't.
With the exponential growth of information available on the World Wide Web, users increasingly turn to search engines to help them navigate through a sea of content in order to find the information they are looking for. At the same time, content providers each want to be the ones serving up the morsels of data in the hopes of luring eyeballs to their site to entice visitors with advertisements and other revenue generating opportunities.
The sheer volume of searchers and content providers equates to a digital age "gold rush" that dwarfs any opportunity in the history of mankind. Unfortunately, whenever there is a feeding frenzy of this magnitude, there are opportunistic individuals seeking to profit from the ignorance and naivety of others.
This document aims to equip Web authors with the information necessary to adequately prepare their Web pages for search engine placement in accordance with the most basic rules of the World Wide Web. Absolutely no promise is made as to the end result of the use of this information; however, HTMLHelp.com has effectively demonstrated that the techniques outlined within are effective and long lived.
The partners of the Web Design Group urge caution when dealing with anyone claiming to be able to guarantee monumental increases in search engine placement by utilizing any methods not outlined here in this document.
That is not to say that sites cannot be optimized for better search engine rankings, nor should it be inferred that anyone offering assistance with search engine optimization is a crook. On the contrary, there are many techniques which will improve Web sites, and there are experts who will implement the necessary changes.
This document aims to outline everything one needs to know in order to do the job themselves. Should you choose to pay a professional, at least you should understand what they are doing and why.
Search engines exist to serve the needs of the masses. The internet is the most open of capitalistic systems in the sense that users award the best sites with their traffic. Search engines realize that end users want to find what they are looking for quickly. This means that search results must be accurate and useful.
At the same time, there exists a large subculture which views search engines as an endless supply of visitors just waiting to help them get rich quick. This subculture constantly attempts to find ways to jump to the top of the search engine rankings in order to rapidly monetize an investment.
Oftentimes rogue sites will employ highly deceptive means to attract visitors. Documented methods already include: Googlebombing, style sheet abuse, NOSCRIPT tag abuse, spamdexing, cloaking, link spam, doorway pages, link farms and Googleating. Search engines then retaliate by changing their algorithms to exclude such practices and restore an otherwise even playing field.
This constant - and costly - battle ensures that any technique which falsely promotes a site to the top of the rankings will also topple the site as soon as the search algorithm is updated. As a rule the least-cost, highest performing and longest lasting results will be achieved by sites which are appropriately designed, content rich and technically proficient.
Conversely, a site with rich content but devoid of any aesthetic value may not meet all the needs of Web authors or end users despite otherwise fantastic search rankings. For these and many other reasons, Web authors must take special care when designing their sites to ensure that they will be accessible to all users - mechanical or human.
Since a thorough discussion of design could easily fill an encyclopedia, this document will simply provide a list of design considerations along with simple rationale for each.
- Offer a site map with links that point to important parts of the site. If the site map is larger than 100 or so links, break it into separate pages. These help search engines locate all of the content on a site.
- Make sure each page is reachable by at least one static link. If a search engine cannot find your document, it will never show up in a user inquiry.
- Keep URLs simple and static. Complicated URLs are difficult for people to type and hard to remember. Additionally, longevity is a factor in search ranking (more on this later...).
- Keep the site hierarchy fairly flat. That is, each page should only be one to three clicks away from the home page. This aids both humans and machines in navigating the site.
- Avoid the unnecessary use of frames as search engines often have difficulty indexing them correctly.
- Minimize the use of Macromedia Flash as well as Java applets. Although they can add useful demonstrations and animations to a site, they are not indexed by search engines.
- Since dynamic page content is expected to change frequently, the relevance to search keywords will probably not be maintained. Moving content to static pages will improve indexing and lighten the load on the Web server.
- Make sure internal pages link to the homepage to aid navigation.
- Organize content by topic and divide the site into logical sections, each focusing on a given topic. This allows search engines to better target specific information relevant to keyword searches.
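The "flat hierarchy" and site map recommendations above can be audited mechanically. The sketch below is a minimal illustration in plain Python: the `site` dictionary is a hypothetical link structure (not a real crawl), and `click_depths` walks it breadth-first to report how many clicks each page is from the home page.

```python
from collections import deque

def click_depths(links, home="/"):
    """Breadth-first walk of a site's internal link graph, returning
    the minimum number of clicks from the home page to each page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, ()):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure: home page -> site map and a section
# index -> individual articles.
site = {
    "/": ["/sitemap.html", "/articles/"],
    "/sitemap.html": ["/articles/css.html", "/articles/html.html"],
    "/articles/": ["/articles/css.html", "/articles/html.html"],
}

for page, depth in sorted(click_depths(site).items()):
    print(f"{depth} click(s): {page}")
```

Any page that comes back more than three clicks deep is a candidate for an extra link from the home page or the site map.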
Design Summary
A Web site which is optimized for search engine effectiveness will be as easy to navigate as a good book. The home page should read like a table of contents, linking visitors to relevant information organized into useful sections. Sites which feature dynamically generated content such as forums or weblogs should still incorporate static HTML pages to aid search engines. Images, animations, scripts and videos should be used only when text is inappropriate and they add to the value of a given page.
Content is King
New Web designers often fall prey to the misconception that sites need to be flashy, colorful and animated to attract and retain visitors. While the "user interface" is important, the content of the page is far more so. Those flashy Web pages often make the content more difficult to access and read, and they often suffer as a result in the search engine rankings. Besides, veteran Web surfers learn to ignore the look of a page and focus on the content. And eventually everyone will be a veteran.
It is probably easiest to illustrate this principle using real world examples. Take, for instance, the case of a young man getting a job and needing to wear a tie for the first time. Only problem is, he doesn't know how to tie one. So a simple search for "how to tie a tie" on Google returns the site http://www.tie-a-tie.net/. Go ahead and check it out... we'll wait.
Notice that the site has a clean interface. There is a clear menu listing on the left and content on the right. Graphics are only present to add information. And most importantly, the site actually demonstrates "How to tie a tie."
Try another search. Suppose one needs to learn "how to plant a tree". Notice that almost all of the sites displayed at the top of the list will actually help you plant a tree. Additionally, most of the sites towards the top of the listing have minimal graphics and formatting and maximum tree planting instructions.
If you encounter a site which doesn't help you plant a tree, you get frustrated searching for the information and leave. Why would anyone trick a search engine into bringing them visitors who don't want to be there? It puts load on the Web server and ensures that people go away unhappy. But it's the same philosophy that e-mail spammers use: "If I get 100,000 people to look and only 1 buys, well that's one more sale than I would have otherwise had."
Since content is the main ingredient in search engine placement, thought should be given to the words users would type to find a specific topic. Each page should include appropriate words within it. Don't over-reach here. Most search engines are smart enough to find related terms such as "barber" for "stylist" as well as derivations of words such as "work / working" and even common misspellings. Cramming words into a page for the sole purpose of search engine optimization will harm rather than help rankings.
- Create a useful, information rich site and write pages that clearly and accurately describe your content.
- Try to use text instead of images to display important names, content or links. Search engines can't read images, and neither can people with visual disabilities.
- Choose topics which are original and unique.
- Limit pages to a reasonable size. If the content is lengthy offer a Table of Contents and divide the information into usable pages. This also allows each section to be more targeted by search engines.
- Exercise "Conservation of Words". Once you've gotten the message across, stop writing. Verbosity for the sake of increasing "keywords" will only drive real visitors away. And that's not good for traffic building!
- Proof-read, spell check and get peer reviewed. Every site can benefit from multiple opinions and multiple content edits. If your content isn't good enough to be published in the newspaper it won't be good enough to compete on the Web against millions of other pages.
- Make sure the TITLE element for your document is concise and accurate. The page TITLE is used by search engines to display link text as the result of a search.
- Ensure that each IMG element includes an ALT attribute.
- Provide links to interesting, related content when appropriate, but keep the links on a given page to a reasonable number.
- Always reference citations and sources. This indicates to search engines that the content is of research quality.
- Illegal content within a page will most likely result in that page's omission from search engines. Especially in certain countries.
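Two of the items above, a concise TITLE and an ALT attribute on every IMG, can be checked with a short script. This is a minimal audit sketch using Python's standard html.parser module; the sample page is invented for illustration.

```python
from html.parser import HTMLParser

class AltTitleAudit(HTMLParser):
    """Collects the page TITLE and flags IMG elements with no ALT text."""
    def __init__(self):
        super().__init__()
        self.title = None
        self._in_title = False
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.images_missing_alt.append(attrs.get("src", "(no src)"))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data.strip()

# Invented sample page: one IMG carries ALT text, one does not.
page = """<html><head><title>How to Tie a Tie</title></head>
<body><img src="knot.png" alt="Four-in-hand knot">
<img src="spacer.gif"></body></html>"""

audit = AltTitleAudit()
audit.feed(page)
print("TITLE:", audit.title)
print("IMG missing ALT:", audit.images_missing_alt)
```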
Content Summary
Put more time and energy into developing content that will be useful to people, and less into worrying about it being "pretty" or where it is going to show up in search engines. You will get many times the return on your investment.
Man vs. Machine
First and foremost, search engines are machines. They do not read Web pages the way most humans do. Furthermore, they are programmed with complex algorithms written by very smart programmers who are concerned with the way information is supposed to be presented. This means that the machines search for information in a manner that conforms to valid HTML and other Web programming standards.
This presents both a challenge and an opportunity to Web designers. The challenge is to ensure that each page written for the Web contains valid HTML which can be easily interpreted by not only search engines, but also Web browsers. The opportunity is to make content accessible to all users, world-wide, without bias towards any particular Web browser, computing platform or screen resolution. The greater the potential audience, the more visitors one will receive.
Frankly, HTMLHelp.com was founded a decade ago to promote these specific ideals and with the full realization that information is most valuable when presented in a manner accessible to all. Google had not even been invented when this site was designed and launched, but a survey of relevant keywords on all of the major search engines will demonstrate that building a site along the example set here will offer extremely favorable search results:
| Search Term | Google Results | Yahoo Results |
| --- | --- | --- |
| CSS Help | 1st Site Listed | 2nd Site Listed |
| Cascading Style Sheets Help | 1st Site Listed | 1st Site Listed |
| CSS Guide | 1st Site Listed | 3rd Site Listed |
| Style Sheets help | 1st Site Listed | 7th Site Listed |
| Style Sheets | 2nd Site Listed | 2nd Site Listed |
| HTML Help | 1st Site Listed | 1st Site Listed |
| HTML Reference | 1st Site Listed | 3rd Site Listed |
| Web Authoring | 2nd Site Listed | 1st Site Listed |
| HTML 4 | 2nd Site Listed | 2nd Site Listed |
| HTML | 6th Site Listed | 6th Site Listed |
- Always validate each Web page. Search engines may not recover from HTML errors in the same way that a browser does. Tools such as the HTMLHelp.com Validator will find problems and give explanations of how to fix them.
- Use a text browser such as Lynx to examine each site, because most search engine spiders see your site much as Lynx would. If the site is not navigable via Lynx, search engines will not appropriately index it.
- Avoid the use of WYSIWYG Web editors such as Microsoft FrontPage. The HTML created by these tools is virtually never valid and will frequently render improperly across multiple platforms. There are many sites offering HTML tutorials if necessary.
- If a page is moved, set up the page's original URL to redirect to the new page using a permanent redirect (HTTP 301) or a temporary redirect (HTTP 307).
- Provide META elements (including keywords and description) that accurately describe the contents of a Web page.
- Allow search bots to crawl sites without session IDs or arguments that track their path through the site.
- Don't use "&id=" as a parameter in URLs. Some search engines don't index these pages.
- Routinely check for broken links. Services such as HTMLHelp.com's Link Valet will automate the process and render a report of broken links.
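The URL recommendations above (no session IDs, no "&id=" parameters, simple query strings) can be screened automatically before pages go live. The sketch below uses Python's standard urllib.parse; the list of session parameter names is an illustrative assumption, not an exhaustive one.

```python
from urllib.parse import urlparse, parse_qs

# Parameter names that commonly carry session state -- an
# illustrative list, not an exhaustive one.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def audit_url(url):
    """Return a list of likely indexing problems for one URL,
    following the recommendations above."""
    problems = []
    query = parse_qs(urlparse(url).query)
    if any(name.lower() in SESSION_PARAMS for name in query):
        problems.append("carries a session ID")
    if "id" in query:
        problems.append("uses an 'id=' parameter some engines skip")
    if len(query) > 2:
        problems.append("complicated query string")
    return problems

for url in ["http://example.org/articles/css.html",
            "http://example.org/page?PHPSESSID=ab12&id=7"]:
    print(url, "->", audit_url(url) or "looks fine")
```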
Technical Summary
If you are going to go through all the trouble of compiling or authoring interesting and unique information to place on the Web, make certain that the manner in which you place it there technically conforms to the rules of HTML established by the W3C. Otherwise, why bother?
No Site is an Island
After investing the time and energy to build a site with rich content, accessible design and technical excellence, a Web author will want to ensure that the site is visited and used by those for whom it was intended. The first instinct one has is to seek as many links and as much traffic as possible. But that instinct can be potentially fatal if left unchecked.
Before delving into this topic further, an example might be in order. As was previously mentioned, HTMLHelp.com was founded a decade ago and since that time has become one of the highest ranked Web authoring sites on the Net. It should, however, be noted that the Web Design Group did not purchase advertisements or launch link request campaigns. On the contrary, the site has been linked to because of the quality of the content. This author would be willing to bet that the previous example of Tie-a-tie.net rose to the top in the same manner.
If a site has been constructed according to the principles laid out in this document, the first step is simply to submit the URL to the major search engines (Google, MSN, Yahoo) as well as the Open Directory Project. The combination of good design and relevant information will likely begin yielding some traffic very soon after submission.
One excellent source of traffic, and a good way to receive a permanent link from a reputable source, is to submit an article to a popular site within a related field of expertise. The article should include the author's contact information, including a URL link, near the bottom. Should it be accepted the link will exist for as long as the article is available on the site.
Incoming Links Make a Difference
The major search engines have publicly stated that one component of the ranking algorithm is the number, and importance, of sites linking to a particular page. Logically, if many related sites link to a given page, that page is likely important. Looking back to the previous example of Tie-a-tie.net, Google finds 143 sites linking to it.
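The principle that a link from an important page is worth more than a link from an obscure one is the heart of the published PageRank concept. The following toy implementation is a deliberate simplification for illustration only, not the algorithm any search engine actually runs, and the mini-Web of sites in it is hypothetical.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: every page starts equal, then repeatedly
    passes a share of its rank to the pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
        rank = new
    return rank

# Hypothetical mini-Web: three small sites link to "tie-a-tie",
# which links back to one of the blogs that reviewed it.
web = {
    "blog-a": ["tie-a-tie"],
    "blog-b": ["tie-a-tie"],
    "forum": ["tie-a-tie", "blog-a"],
    "tie-a-tie": ["blog-a"],
}

ranks = pagerank(web)
for page in sorted(ranks, key=ranks.get, reverse=True):
    print(f"{page}: {ranks[page]:.3f}")
```

With three incoming links, tie-a-tie ends up with the highest score even though every page started out equal, which is exactly the effect described above.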
There is also nothing wrong with politely asking visitors to link to your site. A simple statement at the bottom of an article such as "If you found this helpful, please consider linking to this page" may be all that is necessary. Or a dedicated "Link to this Site" page is a good option, and it has apparently worked for Tie-a-tie.net.
Be very wary, however, of anyone promising to link to a page only in return for a reciprocal link. A single link to a "bad link neighborhood" can sink a site's search ranking, or even cause it to be removed from the index altogether. A "bad link neighborhood" is most easily defined as a page whose sole purpose is to exchange links for search engine rankings, regardless of the applicability to the topic of the page.
The importance of a given page also weighs heavily in the effect a link from that page makes. This could be compared to the Oprah "Book Club" effect. If one million average people recommend a book to their friends, it will sell a lot of copies. But if Oprah recommends the same book just once to her audience it will likely have the same or greater effect.
Suppose Microsoft.com provided a link to a given site. That one link would generate more traffic than many hundreds or thousands of links from smaller sites. Search engines take this into account and reward sites that receive a direct URL link from a site with a high PageRank. On Windows systems, the Google Toolbar browser add-in will display the Google PageRank of any page displayed within the browser window.
A final reported factor in a link's effect is the age of the link in question. Some search engines reportedly give a link greater weight the longer it remains in place.
- Search engines monitor the rate of acquisition of links to a site. Too many, too fast could indicate "unnatural" link buying activity and harm a site's rankings. This is another reason not to buy into any "buy a bunch of links" schemes.
- Search engines also monitor the rate of removal of incoming links to a site. If several sites begin simultaneously removing links it may indicate a user-affecting issue.
- Broken outgoing links which are not rectified promptly indicate to search engines that a site is infrequently updated and can potentially harm rankings.
- Affiliations between a linking site and the linked site may be inspected by some search engines. If for example, they share an IP address or have a common postal address on the "contact us" page, the search engine will, at best, ignore the link.
- Be sure that sites you link to are relevant to the topic of your Web page and will be appreciated by your visitors.
Purchasing Links
There is evidence to suggest that links can be purchased from sites with relevant content to increase search engine rankings. The same rules still apply to "for pay" links, namely:
- Purchasing from a "Bad Link Neighborhood" would only compound the stupidity. Paying to have a site penalized in rankings would be utter foolishness.
- The sites must have some relevance to one another.
- Ranking benefits have been known to increase in a matter of hours or days once links are in place.
- Ranking benefits have been known to decrease in a matter of hours or days upon the removal of links.
- Direct URL links purchased from highly ranked, relevant sites, often deliver significant "click through" traffic in addition to search ranking benefits.
- It is not recommended to purchase links from a page with more than 15-20 other off-site links. At some unknown point search engines begin to penalize for too many off-site links. Additionally, link "click through" traffic is diminished by the dilution effect.
Links Summary
Links, paid or otherwise, are generally considered a good thing for search engine rankings; however, the link source must be relevant, constant and reputable.
Longevity is a Plus
Another trick often attempted by unscrupulous "Search Engine Optimizers" is to register a large number of domains and cross link them in order to drive up search engine results. These domains are essentially treated as disposable in that they only exist to increase the popularity of a featured domain.
Search engines use the age of a given domain name, the age of the content, and the length of time a link has existed to help detect this type of fraudulent activity. As search engines discover new "bad link neighborhoods" they nullify the effects of disposable domains.
For the legitimate Web author, all of this means that high quality content will be given more and more weight by search engines, the longer it remains on the Web. Additionally, the longer high quality links remain in place, the more weight they are given. The old adage "patience is a virtue" definitely comes to mind.
Miscellaneous Reported Factors
The following is a list of factors believed by some to have an effect on search engine rankings, though the exact weighting of these elements is unknown or unproven.
- Metrics collected from other sources, such as monitoring how frequently users hit the back button when sent to a site.
- Metrics collected from sources like the Google Toolbar, Google AdWords/Adsense programs, etc.
- Metrics collected in data-sharing arrangements with third parties (like providers of statistical programs used to monitor site traffic).
- Hosting performance and uptime. Sites which are frequently unavailable or slow will rank lower.
- IP address of hosting service and the number/quality of other sites hosted on that IP.
- Hand ranking by humans of the most popular sites.
For convenience, the following list of practices to avoid is compiled directly from the guidelines published by the big three (Google, MSN, Yahoo):
- Avoid using hidden text or links.
- Don't use pages that harm accuracy, diversity or relevance of search results.
- Do not load pages with irrelevant words in an attempt to increase a page's keyword density. This includes stuffing ALT attributes that users are unlikely to view.
- Don't create multiple pages, subdomains, or domains with substantially duplicate content. This includes automatically generated pages of little value.
- Avoid "doorway" pages created just for search engines or other "cookie cutter" approaches such as affiliate programs with little or no original content.
- Don't use computer programs to submit pages to search engines.
- Don't participate in link schemes designed to increase a site's ranking.
- Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as "cloaking."
- Don't use numerous, unnecessary virtual hostnames.
- Avoid misuse of competitor names.
- Do not employ pages that use excessive pop-ups, interfering with user navigation.
- Don't use pages that seem deceptive, fraudulent or provide a poor user experience.
- Do not abuse the META elements. Loading the "description" and "keywords" tags with unrelated information will make a site appear as if it were "SEO'ed" and result in penalization.
Web authors who implement the previously stated recommendations should find that their search rankings increase. Some items will have a greater effect than others. As added benefits, the finished site should be readily accessible to all users, regardless of platform, and the hard work should pay dividends for years to come.
By focusing on the needs of the end user, designing to meet the standards of the Web and exercising the patience necessary to properly publicize and develop a site's traffic, any Web author can operate a popular and perpetual site. So get out there and pour your talents into creating something excellent! HTMLHelp.com will be there to help.