SEO Guide by Chris Beasley© Jalic Inc. 2003-2007. All Rights Reserved.

Foreword

When I first launched websitepublisher.net in 2003 a key cornerstone of it was my guide to building a successful website, which I had written in late 2002. This guide contained a section on search engine optimization, but it was merely one portion of it.

One thing that has been evident since that time is that this site is most often recommended as a source for learning SEO, and yet there was never any central location on the site that really sat the reader down and laid out a path of which articles to read.

So, as I rewrite my guide this year, I decided to pull the part about SEO out and make it its own guide. This guide is still in the same spirit as my guide to building a successful website. It teaches a fundamental methodology that offers a potentially slow but sure way to success and good rankings. There are no tricks for overnight success; those often involve black hat techniques that can work quickly, but only temporarily. The ban hammer always falls, and as a businessman I'm more concerned with long-term, stable potential than with making a quick buck and always chasing the next trick.

I've been making websites since 1994, doing SEO since late 1999, and have been writing articles on it for major websites since 2001. Having that experience has given me the perspective necessary to deal with all the misinformation you may find elsewhere. I remember a time when Google didn't exist. I remember meta tag hype in the 90s. I remember dozens of theories people pushed that ended up being shown to be wrong. I've had great success with search engines, gained rankings that were directly the cause of hundreds of thousands of dollars in revenue, and I've lost such lucrative rankings in surprise updates (thankfully, there were always more to be had).

So why should you use this guide instead of one of the others out there? Because I have experience: the experience to know when a theory is bogus or irrelevant, and the conservative skepticism not to endorse anything unsubstantiated. This guide is nothing but rock-solid fundamentals, tried-and-true methods that I have personally found success with. And oh yes, this guide is free.

I'm a businessman; I work for myself. I do maybe one or two jobs a year for other people. This means that I'm not trying to sell you something. You may find others who more readily embrace the theory du jour, because they want to show their clients they're on the cutting edge, or they want to sell more books or more tickets. I don't have a big SEO consulting business, I don't have a book to sell you, and I won't try to sign you up for a seminar. I have no motive to scaremonger you into an upsell. All I care about is presenting accurate information. If I can't prove it, I don't publish it, or I publish it only with a caveat.

I also have a very good track record of being right on such issues. Many of my earliest SEO articles, written in 2001, 2002, and early 2003, dealt with myths and misconceptions. I often took flak for them at the time, but everything within them was later shown to be accurate and is now accepted as truth by most SEO professionals.

This free guide contains over 20,000 words and would be over 60 pages long if printed, and that includes just the six core sections. There are dozens more pages of supplemental reading linked from here and from within the guide itself.

The only payment I ask for this guide is that if you like it, you recommend it to others. Additionally, if it has helped you, I would appreciate an email or a forum-post testimonial.

Chris Beasley June 2007

Search Engine Optimization Fundamentals

Search Engine Optimization Philosophy

I teach a fundamentalist search engine optimization philosophy. I have been actively optimizing sites now since 1999, which in the grand scheme of things isn't all that long, but in the realm of the Internet that is a good deal of time, and certainly longer than most.

In this time I've achieved many great rankings, I've had rankings suddenly vanish, and I've witnessed countless theories that ended up being wrong. All of this has formed my opinion that fundamental SEO is the way to go.

Search engine optimization is a shady industry. There are many clueless individuals trying to make a buck by passing themselves off as experts. Even worse, there are many people who honestly think they are experts but who have only been in the industry a short time. Finally, there are those people who do not come from backgrounds that emphasize logic, critical thinking, and scientific observation. (I talk more about this here.) Take all that, add in a healthy dose of Internet anonymity, and you end up with a lot of bad information out there.

Compounding the problem is that search engines are not in a constant state; they are extremely complex and frequently changing. So even if you manage to navigate your way through the mess of misinformation that is search engine optimization, by the time you find something valuable and helpful it may already be outdated, or will be in short order.

The answer to this is to ignore the theory du jour and instead focus on the fundamentals. Anything beyond fundamental search engine optimization, as I define it anyway, sits in a shady gray area, and the search engines actively work to combat those methods because they are often used by spammers. So the only truly safe method (and by safe I mean stable: you're unlikely to see a huge drop in ranking in a future update) is to use fundamentals.

The rest of this article is meant to be a guide to the fundamental approach to search engine optimization.

Search Engine Jargon

Before we get into the meat of search engine optimization it is important that everyone is on the same page. Some readers of this article may be entirely new to the industry, so some definitions are in order.

There is often significant confusion over what exactly a search engine is if you talk to anyone outside the search engine optimization field. For instance, many people define a search engine as any website where you can perform a search; this is not accurate. The correct definition of a search engine is a service that creates an index of the World Wide Web through automated means. A site that simply offers a search service, and doesn't manage its own index, is merely a search portal. There are also pay-per-click (PPC) search engines, which are really more similar to advertising networks, except that the ads they buy and sell are search listings rather than graphical ads. Then there are directories, which do the same thing as a search engine but do it all manually and group the listings into topical categories. Finally there are meta search engines, which are search portals in that they do not run their own index, but which offer unique results because they generate them from multiple indexes in a sort of compilation.

To offer some examples, AOL.com is a search portal, they do not run their own index. Google, Yahoo, MSN (Live.com), and Ask.com are all search engines, they have their own index. SearchFeed.com is a PPC engine, it only has paid listings. DMOZ.org is a directory, their listings are all done by hand. Dogpile.com is a meta search engine.

Now, Yahoo has a directory too, and Google rebrands DMOZ data on its site. Additionally, both have PPC results. But these are all extras; both are fundamentally search engines.

How Search Engines Work

Each search engine works a little differently, but most have general characteristics in common. Unlike a directory, where you have to submit manually and rely on a listing made by a human editor to determine your ranking, a search engine uses your entire page, and sometimes your entire site, to determine a ranking. Also unlike a directory, you don't need to submit to a search engine to be included, although submitting may expedite the process. Search engines send out robots (also known as spiders) that crawl the Internet following links from one site to another. So the only thing you actually need to be listed in search engines is a link to your site from a site already in the engines.

Each search engine will have a different way to submit. With some it's a simple form; with others you have to pay to submit. Regardless of the submission process, you only need to submit your root URL to each engine (assuming that internal links can lead a robot to all your pages). You also do not need to resubmit at any time in the future; there is no need to repeatedly submit your site in order to maintain your ranking. Search engines periodically revisit every site in their index, so your listing will stay up to date anyway, and resubmitting won't get your listing updated any sooner.

Once you've submitted, watch your site's statistics logs; you should notice spiders visiting shortly. How long this takes could be anywhere from a few days to a month or more, depending on where the search engine is in its update cycle. To identify the spiders you will need to look at their HTTP_USER_AGENT value, the same string that normally identifies a visitor's web browser. For a full listing of robot user agents see our article "Search Engine Robots."
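The log check above can be sketched in a few lines of code. This is a minimal example, not a definitive implementation: it assumes an Apache combined-format access log, where the User-Agent is the last double-quoted field on each line, and the bot-name substrings are just a few well-known examples (see the robots article for a fuller list).

```python
import re

# A few well-known spider User-Agent substrings circa 2007:
# Googlebot (Google), Slurp (Yahoo), msnbot (MSN/Live).
BOT_TOKENS = ["Googlebot", "Slurp", "msnbot"]

def find_spider_hits(log_lines):
    """Count requests per spider User-Agent in combined-format log lines."""
    counts = {}
    for line in log_lines:
        # In combined log format the User-Agent is the last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        if not quoted:
            continue
        agent = quoted[-1]
        if any(token in agent for token in BOT_TOKENS):
            counts[agent] = counts.get(agent, 0) + 1
    # Most frequent spiders first
    return sorted(counts.items(), key=lambda kv: -kv[1])
```

Run this over a day's access log; any entry that appears is a spider that has already found your site.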

Once you notice the spider in your logs, it can take additional time for your site to actually be listed in the search engines. Not all engines publish new listings as soon as they are available; some will wait a month or more so that they can update all sites at once.

Getting into search engines can often be as lengthy a process as getting into directories, even though search engines use robots that automate the process. You should be patient and refrain from resubmitting your site. The only times you should resubmit are if you notice a search engine spider in your logs but six weeks go by without your site appearing in the index, or if your site gets dropped from the index. At the same time, don't worry too much about oversubmitting; search engines generally don't mind that anymore.

Search Engine Optimization

Search engine optimization is the practice of tweaking a page so that it is ranked well under target keywords in the organic, or natural, search results (organic results mean not paid placement). It differs from search engine marketing, which is the practice of tweaking Pay-Per-Click campaigns for optimum ROI (return on investment).

In the earliest days of the practice, all a webmaster had to do was repeat a keyword on a page over and over again and the page would rank well for that keyword. This was of course easily abused, and search engines had very poor relevancy. Then Sergey Brin and Lawrence Page founded Google based on an algorithm that used both on-page and off-page factors, and the search engine world changed forever. Now every major search engine uses the off-page factors that Brin and Page pioneered, so while the engines are all different, when you focus on the fundamentals the changes you make will be beneficial for all of them.

How Search Engines Rank Pages

As I discussed above, today search engines rank pages using a combination of off-page and on-page factors. Additionally there are two types of factors, quality modifiers and topic modifiers.

On-page factors are anything that is on your page: not just what is visible, but anything in the source code that search engines can see. To see what search engines see, browse to your page and, if using Internet Explorer, go View > Source or, if using Mozilla Firefox, go View > Page Source. This is what the search engines see.

Off-page factors are primarily links and the context associated with them. Remember that I'm saying off-page, not off-site. Even though you look at your site as one whole, search engines are much more interested in your pages as individuals. Think of links as votes: a link from one page to another is a vote from one page for another. The more votes you have, the better. Additionally, the anchor text, the text used within the link, is part of the equation in that it indicates what the vote is for. The text around the link, the topic of the page in general, and everyone who links to the page that links to you, all of these can modify how much benefit you will get from each link.

Warning

Some people get confused and think that if search engines see your source code then they must be seeing the source code of your dynamically programmed pages too. This is not the case. All search engines see is what you see when you use the View>Source menu option. If your site is dynamically programmed this would be content your program or script prints out, so it doesn't actually see the programming behind your script.

Quality modifiers are factors that make the difference between being ranked #20 and being ranked #1. They are factors which search engines take to mean that your site is good. The biggest quality factors are off-page ones, as search engines no longer trust webmasters to rate their own quality; incoming links are thus the main quality factor in search engine algorithms. Also, many spam filters and/or penalties relate to your overall quality; there are a lot of little things that can bring you down in this regard.

Topic modifiers are factors that help a search engine to decide which keywords to rank your page under (as opposed to how high or low to rank it). The main topic modifiers are on-page factors such as keyword usage on your page, your title, your headings, and other on-page factors that will be discussed in other articles. However, the off-page context of your incoming links is a very important topic modifier as well.

The point I want to make with these factors is that you need all of them in a good balance in order to be successful in your search engine optimization efforts. A million incoming links but poor on-page optimization and you'll not rank well. Perfect on-page optimization but no incoming links and you'll not rank well. Good on-page optimization, lots of incoming links, but bad context & anchor text for your incoming links and you'll not rank well. You need as many incoming links as you can get, but within a good context for your website, and you need good on-page optimization without breaking any search engine guidelines. Do all that and you will rank well.

Warning

Don't fall into the trap of thinking that just because, in your opinion, a feature makes a website better, it must be part of a search engine's quality score. The fact is, if any single feature makes a site better, then that site, being better, will naturally accumulate more incoming links, which are already accounted for and are a far more accurate measuring tool.

Search Engine Optimization and Accessibility

Many people mistakenly believe search engines want your website to be accessible and perfectly coded. This is not true. The reality is that building an accessible or usable site and building an effectively optimized site are, for the most part, coinciding goals. Many of the same changes you would make to enhance the usability of your site also help you rank better in the search engines; they help in each way for different reasons, but they do help in each way.

Then of course, as I mentioned above, anything that makes a site higher quality generally earns that site more incoming links, so building a usable site is certainly a good idea. Just make sure you understand that there isn't any direct bonus for usability; it is easy to get carried away with it if you make that mistake.

Back to the Basics

Fundamental search engine optimization is mostly about building a good site, with common-sense optimizations, and letting the cream naturally rise to the top. It is not a get-rich (or get-ranked-high) quick method; rather, it is a slow, steady, and proven method to success. Mostly this philosophy of search engine optimization simply ensures that you are not doing anything to hurt yourself, and then allows you to focus on developing your content and your website. Good, solid, quality content is really the most important thing. I firmly believe that it is more important to have a reliable search engine ranking than to be bouncing all over and risk being tossed entirely.

I've only set the stage with this article, I've laid the framework for a solid understanding of search engine optimization, but there is much more to learn. Please continue your research through the articles below:

Off-Page Factors

All About Link Popularity & PageRank
Link Building Methodologies
Hub & Spoke: How to Cross-Link Your Websites

On-Page Factors

Choosing the Right Keywords
On-Page Search Engine Optimization Techniques
Site Architecture: Optimizing your Internal Links

Niche SEO Topics

3 Types of Optimization: Homepage, Category Page, Content Page
Search Engine Friendly URLs for Dynamic Sites
SEO for Ecommerce Sites

Keyword Research: Making Effective Keyword Choices

An often overlooked, but vitally important, factor of search engine optimization is keyword research. All the optimization in the world will do you no good if you are optimizing for the wrong keywords. Very early in the planning process keyword research can even help you decide on a site title or domain name. How would you like it if you built a site around a certain domain only later to find out that had you chosen a different domain you could be getting 10x more traffic? Such a scenario is not impossible, so read on.

Keyword Popularity

The first aspect of keyword research you must undertake is determining which keywords are more popular. There are two main tools webmasters use to do this. The first is Wordtracker, a paid service, although its free trial does enough for most of your needs. The second is Overture's (Yahoo Search Marketing) Keyword Suggestion Tool.

Wordtracker is primarily powered by statistics from Dogpile.com, a meta search engine with a small market share (but still a sample size of millions). Overture's results come from Yahoo, which is of course much larger.

Which service to use (and in fact you could use both) is mostly a matter of personal opinion; however, I believe Wordtracker to be better. Wordtracker may have a smaller sample size, but when you're dealing with hundreds of millions of samples you're going to have statistically significant results regardless of market share. Additionally, Overture lumps singulars and plurals of the same word together, a process called stemming, which isn't helpful at all. Finally, Overture's results can be skewed by software that repeatedly checks its system for current PPC bid prices.

Whichever system you end up using, it is important to note that these services cannot accurately predict traffic levels. They may give estimates for daily or monthly searches, but those estimates are almost useless. Instead, these services are really only worthwhile when doing comparisons. For instance, they cannot exactly predict traffic for any one keyword, but they can fairly accurately show proportional popularity among different keywords. So while you may not know how much traffic you will get, you will know which keyword has the best chance of providing the most traffic.
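That kind of proportional comparison is simple arithmetic once you have the raw counts. A quick sketch, where the keywords and counts are invented purely for illustration:

```python
def relative_popularity(counts):
    """Convert raw search counts into each keyword's share of the total.

    The absolute numbers from a tool like Wordtracker are poor traffic
    predictions, but the proportions between keywords are meaningful.
    """
    total = sum(counts.values())
    return {kw: round(100.0 * n / total, 1) for kw, n in counts.items()}

# Hypothetical counts purely for illustration
shares = relative_popularity({"widget": 1200, "widgets": 6000, "blue widgets": 800})
```

Whatever the true traffic turns out to be, the shares tell you which keyword deserves the optimization effort.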

Diary

I once optimized for the singular form of a word and reached #1 in the search engines for it, and I was quite happy. Then I did keyword research (which I should have done first) and discovered the plural form of the word receives 5x more traffic, so I could have been making 5x more than I was. I made the change and eventually got to #1 on both versions, but I ended up missing out on probably around $10,000 worth of income because I didn't do my keyword research first. Also, because of how Overture stems, had I done my research with them I would not have picked up on the problem. This is why I recommend Wordtracker.

Keyword Difficulty

The other half of your keyword research has to do with assessing keyword difficulty, which is itself a very complicated concept.

Deciding whether a keyword is difficult is hard to explain; it is almost something you eventually pick up just by being in this industry. Many people use a simple popularity metric: perform a search and look at the number of results returned. The problem with this is that just because a site includes a keyword on a page does not mean it is actively optimizing for that keyword. Some words are simply more common than others, so you cannot use sheer volume as a measure of competitiveness.

Another method is to simply look at the top-ranking page and check its PageRank using the Google toolbar. This is the method I recommend for novices who cannot intuitively judge optimization quality when visiting a site. PageRank is not a perfect system; the toolbar you use to check it is not always accurate, and it still does not indicate perfectly how much work a site is putting into its optimization. However, it will tell you roughly the total weight of the incoming links of the #1 site, and using that you can estimate how difficult your link building will be if you want to beat that site.

Realistically, though, once you become more acclimated to search engine optimization and all it entails, you'll want to check the top 5 or so sites and look at things like how professional they look, their incoming links, their PageRank, their use of on-page optimizations, and the age of their sites. You should also look out for technical hurdles they may not have addressed. Many larger sites or small business sites, even those by very large corporations, are built without any regard to SEO, and so while it may be intimidating to compete with them, the fact is that they often have technical SEO roadblocks that are really hurting them and make them easier competition for you. These technical hurdles usually have to do with redirects, internal navigation, and session management.

Keyword Profitability

Another aspect to consider in your keyword research is how profitable having a site based around the various keywords can be. The fact is some keywords or site topics can pay vastly more than others. This concept was discussed in a previous article on finding high paying keywords.

Riding the Long Tail: Brute Force SEO

"The Long Tail" is a phrase that has become quite popular in the past year. In a nutshell, it means that the overall popularity of all the less popular entities combined is far greater than the popularity of any single blockbuster. The concept is mostly an issue for content distribution. In the past, with traditional distribution channels, your content needed to be broadly popular, a blockbuster if you will. Now, with digital distribution channels, your content can more effectively focus on and serve a specific niche, because you have a method of efficient delivery for that niche.

So how does this apply to SEO? It applies almost directly; in fact, I first read about this concept back in 1999 or 2000 in a newsletter, long before the term "long tail" was coined. You see, the main goal of SEO is not actually to rank well in search engines. The main goal is to provide traffic to your site through organic search results. It may seem like these two things are the same, but they aren't, not always anyway.

If you target the long tail of your search keywords, all the less popular less competitive keywords, you can achieve equal or greater traffic than if you just focused on the single most popular keyword. The reason is that while all those obscure keywords aren't that popular, there are lots of them.
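A toy calculation makes the point. Every number here is invented: no single obscure phrase rivals the head keyword, but together they beat it.

```python
# Invented figures purely to illustrate the long-tail arithmetic.
head_traffic = 1000          # monthly visits from the one popular keyword
tail_phrase_count = 200      # obscure phrases your content pages rank for
visits_per_phrase = 10       # each phrase brings only a trickle

tail_traffic = tail_phrase_count * visits_per_phrase

# 200 phrases at 10 visits each is double the head keyword's traffic,
# even though each individual phrase looks negligible.
```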

More Info

I talk more about optimizing for your content pages instead of your homepage in this article on the three types of search engine optimization.

To effectively target so many different key phrases you will need many different pages of content, the more the better. This content deluge is what I call brute force SEO. By providing a large amount of content you're bound to rank well on some phrases, and the more content you add, the more rankings you will have and the more traffic you will receive.

Of course, there is no reason why you cannot target the popular keyword with your homepage, and all the less popular keywords with your subpages. However, the concept I wish for you to take away from this section is that you do not always need to optimize your homepage. If you feel the main keywords are too competitive there is nothing wrong with focusing on subpage optimization and going after the long tail.

The Keyword Effectiveness Index

Once you measure the popularity and difficulty of your keywords, you can make a visual aid to help decide which keywords to optimize for. This aid is called a keyword effectiveness index.

To construct a keyword effectiveness index you simply plot a chart with search volume on the x-axis and a measure of perceived keyword difficulty (the Google PR of the first site, the total results returned, or similar) on the y-axis. The x-axis represents benefit and the y-axis represents the inherent difficulty of achieving the top spot. You want to optimize for words with low difficulty and high benefit, otherwise known as the path of least resistance.

The idea is to find a keyword that plots as deep as possible into the high-benefit, low-difficulty region of the chart. You of course know your own limitations; if you have the skills and resources necessary to take on an extremely difficult keyword, then by all means go for it. This tool is merely meant to help you in your keyword research; in the end it can only provide guidance, it cannot make your decisions for you.
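If you prefer a number to a chart, the same trade-off can be scored directly. The ratio below is one hypothetical scoring scheme, not an industry standard; the difficulty values stand in for something like the toolbar PageRank (0-10) of the current #1 result, and the keywords and figures are made up.

```python
def keyword_effectiveness(keywords):
    """Rank keywords by a crude benefit/difficulty ratio.

    `keywords` maps keyword -> (search_volume, difficulty), where
    difficulty might be the toolbar PageRank (0-10) of the #1 result.
    Higher volume and lower difficulty both raise the score.
    """
    scored = [(vol / (diff + 1.0), kw) for kw, (vol, diff) in keywords.items()]
    return [kw for score, kw in sorted(scored, reverse=True)]

# Hypothetical data: the middle keyword wins the volume-vs-difficulty trade-off
ranked = keyword_effectiveness({
    "widgets": (1000, 9),       # popular but very competitive
    "blue widgets": (800, 3),   # decent volume, modest competition
    "teal widgets": (100, 1),   # easy but tiny
})
```

Sorting by the ratio is just the path of least resistance made explicit; the output is only as trustworthy as the volume and difficulty estimates you feed it.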

Wrap-Up

I cannot overstate how important it is to conduct proper keyword research. I had to learn the hard way, and I missed out on some serious income because I didn't know any better. You, luckily, have this article and this site; hopefully you will learn from my mistakes and do keyword research from the start.

Once you've thoroughly researched your keywords, you can start learning how to effectively use them on your site.

As always, if you have questions on anything in this article or need help you can visit our webmaster forums.

All About Link Popularity and PageRank

Any link to your site can help build link popularity, so the number of ways to build link popularity is nearly beyond count. I'm only going to focus on the most common methods here, but if you can think of a new and innovative way to get people to link to you then by all means go for it. More methods will be discussed in detail in a further article on link building methodologies.

The most obvious way to get people to link to you is to ask them; however, this method is often overlooked or purposely ignored by people who simply believe it won't work. In fact, simply asking people works very well; you just need to do it right.

The first thing you need to do is locate your targets: who is it that you want to link to you? Well, if you want to rank #1, then the people that link to the #1 site need to link to you, right? To find out who links to the #1 site, search Google for your desired keywords, then use the toolbar to check the backward links of the first site that comes up. These are the sites that you need to get to link to you. It may not be possible to get all of them, but get as many as you can.

Why would someone add a link you might wonder? Well if your site is well designed with plenty of useful content (and if it isn't then you need to go back and read the rest of this site where it is explained that quality matters), then why wouldn't someone want to link to you? Part of the value of a website is that it can lead you to the right information if you can't find it there. So having a list of links to high quality content rich sites adds value to the site with the links. You'd be surprised at how many people will add a link if you ask them nicely. It also doesn't hurt to mention how you think your site would be a good resource for their visitors. This is especially true with amateur sites, and believe it or not amateur sites are important. Sure these sites might not have the best designs but since they've been online for years they've accrued quite a few inbound links and a high link popularity rating. The people who run these sites aren't out to make money so they don't mind linking to you if you have a good site, they're noncombatants so to speak.

If you can't get a free link you might be able to get a link exchange, where you link to their site and they link to yours. This is more typical of for-profit sites. Here is where the Google Toolbar will again come in useful: you will be able to tell which pages have a higher PageRank, so you know which ones are worth more to exchange links with. I don't want to suggest that some links don't count, because they all count. Sometimes, though, you want to limit the number of links you have, and this is one way to do that.

When sending out these link requests you need to be careful about spam. Contrary to popular belief, spam is any unsolicited email. The message doesn't have to be commercial, and it doesn't have to be bulk. Sending a single unsolicited message can be considered spam, and if the recipient complains it may be enough for your host to suspend your account. So if there isn't an email address available on the site for comments, suggestions, proposals, or anything of that nature, then you shouldn't just blindly email them. Some people really hate solicitations, and it is better to be safe than sorry.

Understanding Link Popularity

Link popularity algorithms are extremely complex, and I will not be going into full detail on the nuanced mathematics involved. This article instead aims to provide a solid basic understanding of how link popularity works; parts are simplified for easier understanding.

The most important thing you must understand when working with link popularity is that every page is treated as a separate entity. What this means is that internal links, that is, links between the pages of your site, do count. In fact search engines make no distinction between those links and links from other outside pages. This is something that many people get hung up on; they do not understand why search engines would count internal links. What they don't realize is that there is really no way to tell if a link is from an internal or external page, since some sites span multiple domains and multiple sites can exist on one domain. As such, optimizing your own internal site links can be as important as gaining new links from other sites. For more on optimizing your internal site links see this article: Site Architecture: Optimizing your Internal Links.


Another issue that some people get hung up on is that they think that reciprocal links can hurt you, meaning if you link to a page that links to you then you will be punished. This is not entirely true. Search Engines actively seek out artificial linking schemes and some link exchanges can be seen as such, however they usually just devalue the links rather than penalize the sites involved. So while link exchanges are good, they should be entered into with some trepidation, as will be discussed later.

Now, link popularity is not a purely quantitative measurement. Instead, the quality of links often counts more than the sheer number of links. Quality plays a role in two different ways. The first is that the context of your link is important. If you want to rank high on certain keywords, then you need the pages that link to you to be related to those keywords. If the pages linking to you are not related to your site's keywords, they will do little to raise your ranking for those keywords. More specifically, the text directly surrounding your link, the text within the link itself, and any nearby headers or the page title can all influence what the search engine sees as the context of your link.

If you read my article on choosing the right domain name you might remember that having a keyword-rich domain name or site title can help your rank. Since people will link to you using your domain name or site title, having keywords in them, and thus in your link text, helps with link popularity in this manner. In fact, having people link to you using your keywords is extremely important and beneficial, and of course using keywords in your own internal links helps as well. For instance, it is common to link to your home page using the word "Home." Don't do this if you can avoid it; instead use your keywords in the link. Sure, it may look a little funny, but the benefit will be immense. I have a page that once ranked #1 on AltaVista (yes, I did just date myself) on a search that returned over 8 million results, outranking some rather large corporate entities such as NASDAQ. The reason? I linked to this page from each of nearly 10,000 pages on my site using my desired keywords. The text inside the links that point to your site is extremely important, and a big reason why you need keywords in your domain and site name. Even a link from a completely unrelated site will still help if your keywords are in the anchor text.

The second way quality matters is that pages that rank high themselves carry more influence than those with low ranks. So getting an inbound link from a very popular site can help more than one from an unpopular site. Also how much benefit you gain from a link is dependent on how many other links are on that page. So being the only link on a page will garner more benefit than being one of one hundred links. I often compare this to sharing a pizza. If you have a pizza and share it with 2 people, each person will get more than if you shared it with 10 people. Also if the pizza is larger to begin with everyone will get more. So you'll get a larger benefit from being linked to from a page with a high rank and few links than from a page with a low rank and many links.
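The pizza analogy can be sketched as a toy calculation. This is an illustration of the dilution effect only, not the real PageRank formula (which also involves a damping factor and iterative computation):

```python
# Toy model of the "pizza" analogy: the weight one link passes is
# roughly (the linking page's rank) / (number of links on that page).
# Illustration only; the actual PageRank computation is more involved.

def link_share(page_rank: float, outgoing_links: int) -> float:
    """Approximate weight passed along by one link on a page."""
    return page_rank / outgoing_links

# A high-rank page with few links passes more weight than a
# low-rank page crowded with links:
print(link_share(8.0, 2))    # large pizza, few guests
print(link_share(3.0, 100))  # small pizza, many guests
```

So a single link from a strong, sparsely linked page can be worth far more than one of a hundred links on a weak page.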

This brings me to another important issue. Many people operate under the impression that being in link farms, or FFAs (Free-For-All link sites), which are basically pages filled with hundreds or thousands of unrelated, off-topic links, can hurt their ranking or get them banned. This is not true, for two reasons. The first is an ethical reason: since a webmaster has no control over who links to them, no inbound link will ever penalize you. The second reason is that it is simply unnecessary to penalize people for doing this. A link farm will not have much weight to begin with, and since that rank is diluted across such large numbers of links, the benefit gained from such a link will be almost unnoticeable.

So while getting a large number of incoming links is important, the weight and context of those links are what will make a difference in your ranking.

The final thing you must know about how link popularity works has to do with redirects. Depending on the type of redirection used, search engines will not assign any rank to the resulting page. So if people link to you using a long, complicated affiliate URL, or vice versa, those links will not count. For the most part link popularity is tied to the URL, so to build high link popularity you need all the links pointing to the same URL, not any variation or mirror of it. If you want to maintain search engine rankings you should use a "301 Redirect." This type of redirect sends a special HTTP status code to the client indicating the page has moved permanently, which should tell the search engines to apply the weight to the resulting page. This is important to remember even if you do not use redirects, as www.example.com, example.com, www.example.com/index.html, and example.com/index.html are all the same page, and yet are 4 different URLs. So be sure you link to your internal pages the same way throughout your entire site. Google Webmaster Central does include a tool to handle a portion of this (whether or not to use "www") for you automatically.
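As a minimal sketch of how a 301 might be set up, assuming an Apache server with mod_rewrite enabled and an .htaccess file (example.com is a placeholder; your host's configuration may differ):

```apache
# Hypothetical .htaccess sketch: permanently redirect the bare
# domain to the www version so all link weight lands on one
# canonical URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag is what marks the redirect as permanent; without it, mod_rewrite defaults to a temporary 302, which may not pass the weight along.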


Google and PageRank

Google uses link popularity more than any other engine, and it also provides the best tools for measuring your link popularity. Google is also the most popular search engine in the world, and link popularity matters more than any other criterion when trying to rank high in it. As such, I've decided to devote an entire section to how Google uses link popularity. Since Google also pioneered the modern link popularity approach, what you learn from Google can be applied with success to other engines that use link popularity (in all honesty, most major search engines have more or less copied Google's ranking algorithm).

The first thing I need to mention is the Google Toolbar. Google offers a plugin for Internet Explorer that provides various features, including a bar which displays the current URL's PageRank, as well as an easy one-click ability to check which sites link to the current page. I cannot recommend this toolbar enough; I consider it a must for any webmaster.

The first thing you will notice in the toolbar is a green bar that indicates a PageRank level. To find the specific numerical value associated with the green bar you can hover the mouse over it. PageRank is Google's version of link popularity, and it takes the weight and context of incoming links into account. The amount of PageRank needed for each individual level increases exponentially: it is much easier to go from a rank of 3 to 4 than from 5 to 6. In fact, one or two good links could very well give a page a rank of 4, yet you might need a thousand to get a 6. It's very hard to figure out exactly what the formula is, but my own research points to the base being a 4 or a 5, meaning each level would be 4 or 5 times as hard to reach as the one before it.
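As a rough illustration of that exponential scale, here is a sketch using an assumed base of 5. The base is my estimate only; the real formula is not public:

```python
# Illustration only: if each toolbar level is roughly 5 times harder
# to reach than the previous one, the relative link weight required
# grows exponentially. The base of 5 is an estimate, not a known value.
BASE = 5

def relative_weight_needed(level: int) -> int:
    """Rough relative weight needed to reach a given toolbar level."""
    return BASE ** level

# Each step up costs about 5x the previous step:
for level in range(3, 7):
    print(f"PR {level}: ~{relative_weight_needed(level)} units")
```

This is why a handful of decent links can produce a PR 4 while a PR 6 can demand orders of magnitude more.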

It is important to note that the value given in the toolbar is not always accurate. It does not show the actual PageRank for the page you are currently visiting, merely an approximation that can be up to 4 months old, so take it with a grain of salt. Additionally, if you are visiting a page that Google has not yet spidered, but it has spidered the root domain of the site, then the toolbar will guess a PageRank based on the distance from the root of the site to your current position. This is only a guess and has no bearing on ranking. Once Google spiders the page it will assign an actual PageRank. This guessing behavior is why some Geocities pages sometimes seem to have high PageRanks; they simply have not been spidered yet.

The other thing the toolbar is specifically useful for is that it allows you to easily check backward, or incoming, links. Directly to the right of the PageRank bar you should notice a "Page Info" button; click it and you should see an option for backward links. Selecting that will give you a list of the links Google has in its public index. You will run into cases where a page shows a rank in the PageRank bar but you can find no backward links. This can happen for two reasons. The first is that Google has not spidered the page, and the rank shown is merely the guess described above, which carries no weight at all.

The other reason a ranked page may show no backward links is that Google does not put every page it spiders into its public index. If a page itself does not have a very high PageRank, perhaps because it only has one inbound link or something of that nature, then while Google will note and count the outward links on that page, the page itself will not be listed in the index. This is evident in the various popular directories like DMOZ or Yahoo. If you have a directory listing in an obscure subcategory, that listing will indeed raise your rank, but the subcategory itself may not be listed in the index. If you're interested, you can do more robust link research with Yahoo's Site Explorer.

Your Overall PageRank Power

Getting a good PageRank for your homepage is certainly a good goal, but it is not the most important value for your site. Your Overall PageRank Power, a term coined by yours truly, is much more important than the PR of any given page. Your Overall PageRank Power is the total PageRank a page would get if it were linked to from every page on your site, and only from your site. Finding your Overall PageRank Power is fairly simple. Chances are you have a disclaimer or a privacy policy or something of that nature linked to from every page of your site, and these types of pages don't get links from outside sites. So to find your Overall PageRank Power you simply need to look at the PR of your privacy policy page or your disclaimer.


Overall PageRank Power is a concept only; it doesn't have any bearing on your actual search rankings. However, it is useful in a number of ways. For instance, if you were buying a site, knowing its Overall PageRank Power (the total amount of weight it could contribute to an existing site of yours) would be very useful. If you were planning on building a complementary site to one of your current sites, knowing the Overall PageRank Power of the current site would let you know how much benefit you could get by cross-linking them. In that manner you could also use it to decide whether to make a complementary site at all. For instance, if you know the current #1 site on that topic has a PageRank of "5" and your Overall PageRank Power is a "6," then you can be confident that if you do make this site and cross-promote it with your current site, it will rank well. It is also a good measure because many sites gain more incoming links to their subpages than their homepages, so looking at the PageRank of the homepage alone does not show exactly how much link weight the site has accumulated.

For more information on using your Overall PageRank Power see our article entitled "Affiliate Gold."

Building Link Popularity

One thing you may discover when doing a backward link search is specialized directories. Though they are not as popular as big directories like DMOZ or Yahoo, smaller specialized directories can be a viable source of traffic and link popularity, so be sure to submit to every one you find.

Another way to find smaller places to submit is to search on Google for the words "Add URL" or "Submit URL" (or any other variation of related text such as "Submit site") and your keywords. This will find a slew of places for you to submit your site to.

If you post in forums or newsgroups, this can be another way to build link popularity. Always include a signature with a link to your URL; if a spider visits the forum (and I should note that not all forums welcome spiders), every single post can help your link popularity. I have seen people get a Google PageRank of 6 simply from a large number of forum posts. You may think this is impossible, but after a few years of visiting a forum regularly you can have a few thousand posts, and that translates to a few thousand links to your site. Plus, if you're going to be posting anyway, adding a signature is not much extra work.

Another way to get incoming links is by writing articles. Most sites provide their authors with a signature for their articles where they're allowed to link to or promote their sites. Take advantage of this: if you can write for a site that is related to your content this can help a lot, and if the site isn't related to your content it will still help thanks to the keywords in the links themselves. Additionally, writing for more popular sites will obviously bring more benefit than writing for less popular sites. This site, for instance, welcomes article submissions.

Another, albeit harder, method of getting incoming links is media exposure. Sometimes you can jumpstart media exposure with press releases, though luck can bring it in as well. Whether your site gets mentioned in an online news source, or in an offline source that has a website, a link from a prestigious news source can bring in a great deal of PageRank. This is especially true if your article is on the front page when the spider happens to visit. Even relatively unknown or local news websites usually have high PageRanks, so just because it isn't CNN doesn't mean it won't help. The same goes for getting mentioned prominently at social bookmarking sites; if you get attention from such sites you're bound to get links from the bloggers that read them.

The last thing you need to do to build link popularity is to help the search engines find the sites that link to you. Checking your referrer logs should be a daily habit. I'll admit that when your site becomes popular and you are getting visitors from thousands of referrers each day it gets kind of boring, but it is very important. Every time you come across a new site linking to you, submit it to the search engines. You can also compare what Google and a different engine such as Yahoo show for your backward links, and cross-submit the ones that are missing.


Retaining Link Popularity

Any link leading away from your site can indeed lower your PageRank indirectly. Since internal links within your site do count, if you have to divide the weight of one of your pages between internal links and external links, then of course the internal ones will carry less weight than if the external links weren't there. So in addition to gaining link popularity, it is also important to keep that popularity within your site.

There are a few ways to do this, one easy and often taken for granted method is simply to include a large amount of internal links on every page. If you include a menu of 20 or so internal links on every page then a single external link on one of those pages will take little overall rank away from the internal links.

If it is not possible to vastly outnumber the external links with internal ones then there are other methods you can use to increase the amount of rank retained in your site. You could group all of your outbound links in a single link page, which is a very popular practice. However be sure you don't put a link to the link page on every page of your site, as many do. Instead provide a link to your link page in one location, such as the bottom of your home page. You really want to limit the amount of rank you give your link page because nearly all of it will be given away instead of being recycled back through your site.

Another thing you can do is make your links with JavaScript or with forms. Since spiders do not read JavaScript or forms, such links should effectively stop the flow of PageRank away from your site. Using this method is fine where no link exchange is involved; in a link exchange, however, it presents an ethical dilemma. If someone enters into a link exchange with you for the purpose of increasing link popularity, then such linking methods might be upsetting to them. In contrast, if the person doesn't know what link popularity is and really just wants the raw visitors you will send them, they may not mind the JavaScript links. My advice is simply to not misrepresent yourself. If you're straightforward with everything, including showing them an example link, then your ethics are intact. If they object to the use of JavaScript you can always choose not to link to them, or provide them with a standard HTML link. Of course, with affiliate program links or any outbound links you don't have an agreement about, you should use these techniques to help maintain link popularity. These methods, due to my water analogy below, are known as "plugging holes," which is another term coined by yours truly.
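As a hedged illustration of the JavaScript approach (example.com is a placeholder, and this assumes spiders that do not execute scripts, as was typical at the time):

```html
<!-- A standard link, which passes PageRank: -->
<a href="http://www.example.com/">Example</a>

<!-- A JavaScript-based link that spiders cannot follow, so it
     should not leak PageRank (sketch only): -->
<a href="javascript:void(0);"
   onclick="window.location.href='http://www.example.com/';">Example</a>
```

The second form still works for human visitors with JavaScript enabled, but there is no crawlable URL in the href for a spider to credit.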

Another issue is that you may have pages on your site that search engines don't need to see, yet that you must link to. The solution in this case is to block those pages from being indexed with a robots.txt file or with the robots meta tag. For instance if you offer public profiles of your registered users and have 1000 users then that's 1000 pages that are sucking down PageRank that could be given to your articles. This is especially important with forums that often have dozens of different page types that search engines do not need to see.
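As a sketch of the robots.txt approach, assuming the profile pages live under a hypothetical /profiles/ directory (adjust the paths to match your own site's structure):

```
User-agent: *
Disallow: /profiles/
Disallow: /forum/search.php
```

Placed at the root of your domain, this tells all compliant spiders to skip those paths, so your PageRank flows to pages you actually want indexed.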

When dealing with link popularity I find it easy to think of it like a bucket of water, with incoming links being water poured in and outgoing links being water poured out. You need to be mindful of where you want the water to flow when managing your site. You can also use your own site to "launder" the link popularity gained from other sites. For instance, if you don't have keywords in your name or domain, and most of your incoming links do not contain your keywords, you can still get a benefit from these incoming links. Since those links do not have your keywords in them they won't help you get ranked higher on the search engines, but they still contribute to the overall weight of the page in question. If that page then links to another page on your site, using keywords, then you will have turned this out-of-context weight into context-specific weight. This is one reason why it is very important to use keywords and text links for all your internal site navigation, especially for links to and from your homepage.

Nofollow

A newer method of telling search engines not to count a link is "nofollow," or rel="nofollow". Simply add it to your link code, as in <a href="http://www.example.com" rel="nofollow">Example</a>, and search engines will not count the link.


Summary

Understanding Link Popularity

Every page is treated as a separate entity.

The weight of the links matters more than the number.

Link context is very important.

Google and PageRank

Google's version of link popularity is called PageRank.

The Google Toolbar allows you to check your PageRank.

Calculating your Overall PageRank Power

Your overall PageRank Power is the total amount of PageRank a page would get if it were linked to from every page on your site.

Building Link Popularity

Check up on who links to your competition.

Submit to directories big and small.

Email link requests and link exchange requests.

Retaining Link Popularity

Outnumber external links with internal ones.

Use forms or javascript for external or affiliate links.

If you do not want a search engine to view a page use robots.txt to block it.

Further Reading

PageRank Explained, a technical article showing more of the math behind PageRank, for those interested.

PageRank: An Essay, an essay touching on many PageRank-related topics.

Search Engines and Outgoing Links, we've talked about incoming links; read here to learn about outgoing links.


Link Building Methodologies

Link Baiting

Link bait, or link baiting, is a relatively new term that can have a variety of meanings. In recent years there has been a movement away from manipulative or artificial ways of building links and toward creating very compelling, interesting, or viral content so that people link to it naturally.

There is another side to link baiting though, and that is the publication of false, misleading, or purposefully controversial information to gain links, or the fabrication of articles around topics that appeal to people in a viral way, but do not necessarily have anything to do with your website.

Creating good content is one of the most important things that can help you gain long-term success as a website publisher, so it is definitely something I recommend. Creating purposely false, misleading, sensational, or controversial content just to gain links can, however, come back to haunt you, and is something I do not recommend. Over time, the very people you wish to attract with your content have gotten savvy to shameless link-baiting attempts, such as posting grandiose claims or purposely controversial statements; rather than doing what you hope they'll do (pass your link along), they instead ridicule you or your website for trying to manipulate them.

These more black-hat forms of link baiting can work, and may be worthwhile for a site that you acknowledge will have a short life span. However I would not do it with any site for which you want to have long-term success.

There is still a place for more aggressive link baiting in the white hat world, but only if it is related to your site and only if you're not lying to your readers.

Considering how a broad definition of link baiting could encompass almost all of the tips in this article, I will instead use a narrower one and cover the other possible applications of link baiting in their own section.

Directory Submissions

One of the most straightforward ways to build links is by submitting your site to directories. We all know the big directories such as DMOZ, Yahoo, or Best of the Web, and you should submit to these sites. But you should also consider submitting to various small or niche free or paid directories.

Free directories are a no-brainer, except for those that require a link back in order for you to be listed. I recommend avoiding these directories altogether. Search engines will not penalize you based on who links to you, but if you link from your site to a known spammer or so-called bad neighborhood your placement in the SERPs (search engine result pages) could suffer. Most of these "link back" directories exist only to gain link weight for themselves and are not worth your time.

Paid directories can often be worthless as well. Many people launch paid directories purely as a way to make temporary money: they promote the directory and try to inflate its Alexa rating or PageRank to attract submissions, then move their promotional efforts to yet another new directory, executing a bait-and-switch maneuver. Many have also been banned or penalized by search engines for the practice of selling links without editorial review.


Prices on these small directories can range from a few dollars as a one-time fee to as much as $50 annually. There are literally thousands of such directories out there, and picking valid ones to submit to can take some time, work, and knowledge. We've created a list of legitimate directories here, but it is far from comprehensive. So when evaluating a paid directory to see if it is worth your time, consider the following factors:

What is the Google PageRank or Alexa Rating of the Directory? The higher the better, and never pay for a directory listing if it has a Google PR of 0.

Check the backlinks of the directory, where are they from? A varied collection of real sites, or footer links from other sites the company owns or has bought ads on?

How long has the directory been active? You can check Whois.sc or Archive.org to get an idea. The older the better.

What do the other submissions look like? Are there unrelated submissions in many categories? Do any of the listed sites look like spam to you? If it looks like a directory of spam sites or unrelated links, you do not want to be associated with it.

Price is always a factor, so keep that in mind as you make your evaluation.

If you know the name of the company that runs the site (if they don't tell you on the site, you can usually find it with Whois), do some research on the company. Do they run dozens of cookie-cutter directories, or just this one? Obviously you won't want to bother with a company that runs dozens of directories, as they're just out for your money.

You can use sites such as DirectoryCritic.com as a starting point for your search for directories to submit to.

Finding Submission Locations via the Search Engines

The search engines you are hoping to rank well on can be among your most helpful tools in finding new places to submit your site. Simply search for your keywords plus the phrases "add url," "add site," "submit url," "submit site," "add link," "submit link," "suggest url," "suggest site," or "suggest link," and you should be presented with a list of locations that accept submissions related to your topic. These can be niche directories, or merely link pages on content sites. Some places may have automated link submission forms you merely need to fill out; others will provide an email address. As long as the submissions are free and do not require a link back, you should make a submission.
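The phrase combinations above can be generated mechanically. A small sketch (the keyword "model trains" is just a placeholder example):

```python
# Build one search query per common submission phrase, pairing your
# keywords with the quoted phrase so the engine matches it exactly.
SUBMIT_PHRASES = [
    "add url", "add site", "submit url", "submit site",
    "add link", "submit link", "suggest url", "suggest site",
    "suggest link",
]

def submission_queries(keywords: str) -> list[str]:
    """Return one quoted-phrase search query per submission phrase."""
    return [f'{keywords} "{phrase}"' for phrase in SUBMIT_PHRASES]

for query in submission_queries("model trains"):
    print(query)
```

Paste each resulting query into the engine of your choice and work through the results by hand; the judgment about which sites are worth a submission still has to be yours.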

Another thing you can do as well is search for the backlinks of your competitors. The standard syntax of this is typing "link:http://www.example.com" into a search engine. You should do this search on multiple search engines so you get a more varied and complete data set. This practice is generally referred to as link research.

Browse through the links once you find them and look for ones that would take submissions. Even if it isn't evident that submissions are welcome consider dropping a friendly note to the webmaster of the site (but only if they solicit comments, don't just pull an email address off of WHOIS information and assume it is okay to email them) and ask them to consider adding your site. Be honest and forthright and don't promise anything in return. You may get a link, you may not, but you'd be surprised how often you can get a link in this way without having to link back or pay.

For any of the above submission methods you need to realize that it is vitally important that your website be complete, with high quality original content. Yes, you can easily get free links from the places I've mentioned, but not if your website is generic-viagra-for-less-and-mortgage-loans.com. Do not try these submission methods if you're making a poor quality short-term site to spam with; it will not work and will not be worth your time. These methods are only applicable to quality content sites that won't immediately be turned away because of low quality content.

Page 17: SEO Guide by Chris Beasley · Search Engine Optimization Fundamentals

Viral Marketing & Social Media Optimization

Viral marketing is the creation of content so compelling that every visitor who sees it refers at least 2 more people to it, thus creating a traffic pattern that mimics the spread of a virus.

Social media & bookmarking sites such as Digg, Reddit, Del.icio.us, and others have changed the way viral marketing is often done by providing a built in audience to the viral marketer so that all the viral marketer needs to do is get his page on these sites.

At the most basic level social media optimization would be the use of buttons, such as on this site, to allow visitors to easily "bookmark" your site using one of these services. This is passive optimization and can result in big traffic if an article catches on, but more often than not it doesn't.

A more active method of optimization would be purposefully crafting an article to appeal to the audience of one of these sites. For instance, Digg has a largely male, largely young, and largely technophile audience, and they like short, to-the-point, interesting articles. People have already published formulas, mostly in jest, for getting a page well received on a site such as Digg, based on the types of articles that normally do well there and on the demographic. So the typical Digg article is a top 10 or other such list of interesting & somewhat nerdy facts. Perhaps the top 10 ways to trick out your PC, or the 5 most promiscuous comic book heroines.

This is fine if your site is about PC tweaking or comic books, or any other topic you could massage to fit the Digg audience. However, often people will make content completely unrelated to their site in hopes of gaining link weight through social media popularity. Most would consider this spam, and the people who heavily use social media sites are growing hip to it, so you run the same risk as you do with scandalous link baiting, namely a big backfire that blows up in your face.

Another way to do viral marketing through social media is to commission a funny video and submit it to YouTube or another video sharing site, referencing your website or product either through product placement or a byline at the end. Billion dollar corporations use this method now, but the beauty of it is that unlike a Super Bowl ad you do not need millions of dollars; you just need a cheap camcorder and an idea. People prefer authentic content over content made to promote a business, so this too can backfire. However, perhaps due to the public's desensitization to advertisements in video (see TV), this type of activity is seemingly not frowned upon as much as the social bookmarking manipulation mentioned above.

Regardless of what you do, the purpose of viral marketing or social media optimization is not to gain links directly from a social media site, but to build buzz and interest in your site among bloggers and other content creators so you can gain thousands of new incoming links from a wide variety of sources.

Buying Links

There is nothing wrong with buying links to promote your website. Link buying has existed longer than link popularity search engine algorithms, it is a valid form of advertising. However like many other methods it can be used to spam.

There are many link brokerage services out there, none of which I use as either a buyer or a seller. Most of these sites do not appropriately value high quality links and instead deal more in bulk low quality links from crappy sites. When buying a link you do not want your site in the footer of some forum sandwiched between a link to a viagra site and a link to a car insurance site. What you want is a link on a site that closely mirrors your site in topic. You'll want your link to be one of a few links, ideally the only link, and you want the link presented in a way that makes it appear to be more of an endorsement than an advertisement.


There are different types of links you can buy: often you can buy a link on a single page, or you can buy a link on every page of the site, called a site-wide link. There has been speculation (and so far it is only speculation) that search engines may devalue site-wide links. So try to follow my rule for assessing link values.

Chris's Rule For Link Buying

1000 links (site wide) from 1 site is better than 1 link from 1 site. However 1000 links from 1000 different sites will be better than 1000 links from just 1 site.

Always strive for diversity in the sources of your incoming links, however that doesn't mean you should turn down a site wide link if it is an attractive offer and the site is closely related to your own. The site wide link may not help you as much as the same number of links from many different sites, but it certainly will not hurt you.

Link Exchanges

You'll not find a "Links" page on any of my sites. I do not do many link exchanges; at any given time among all my sites I could count the number of active exchanges on my fingers.

The reason is that, for the most part, they are not useful, and the link exchange "industry" is rife with scammers and spammers.

For starters, a link from an unrelated site, especially one you have to link back to in order to get, is going to be worthless. You'll get nothing context-wise because the site is unrelated, and any weight you get will likely be sent back through your return link. Depending on the two sites in question, and the link page setup, you may even end up losing more weight than you gain.

Links from on topic sites are better, but if it is just from a link directory or links page again you're dealing with a context issue.

The above types of link exchanges may be easy to get because there are a lot of people with links pages that they play fast and loose with, but they aren't worth your time.

Instead, focus on in-content link exchanges, such as one site mentioning the other in an article while the other site recommends the first as an additional resource to its readers. Also consider deep link exchanges for specific pages of content. For instance, if I ran a site on classic literature and had a page on Julius Caesar that listed the non-fiction works he authored, I might accept a link exchange with a site that focuses more on the rest of Caesar's life or the early Roman Empire. I would not seek a link to my homepage; rather I would want a link from their specific Caesar page to my specific Caesar page, and vice versa. The context of such a link will be stellar and should greatly help both pages rank well.

The other problem with link exchanges is that many people are dishonest and will take your link down without telling you, or hide or obfuscate it from search engines. So I prefer to only do link exchanges with reputable sites or with people I know or people I at least know of.

Reaching out to Bloggers

Blogging is a very ego-driven endeavor. There are hundreds of thousands or millions of blogs out there and most aren't read by any large number of people, and yet people still write their blog because they like to think they are important, they like to think they matter, and they like to think people want to read what they have to say.

For some sites or specific industries, and for many ecommerce sites, you can exploit this by reaching out to bloggers for promotional purposes. Old-school brick & mortar promotion is sending out press releases, buying newspaper advertisements or TV ads, maybe hiring a PR firm. Web 2.0 promotion is getting bloggers to talk about your product or site to build buzz and generate traffic and links. The Web 2.0 method also happens to be cheap.


If you have a website or product you are selling that a blogger could possibly review, send them an email asking them to review it. If you're selling a product it is easy to sweeten the deal as you can simply give them a free copy in exchange for the review. Bloggers like feeling that they're important enough to be picked by a company to review something and get free merchandise, but they don't like feeling bought, so stress that you're not seeking a positive review, but an honest review.

The review the blogger writes will undoubtedly include a link back to your site, it may spawn blog posts at other sites, and will send direct traffic in addition to helping with your search engine rankings.

You'll want to find the most popular independent blogs for this sort of thing, the easiest way is to search on Google for your keywords and "blog" and see which blogs rank at the top. If you cannot find a contact email leave a comment on a recent post and ask them to email you.

Assuming your product doesn't really suck you'll probably get a good review anyways. People by nature are good and if you're nice and respectful to the blogger and give them a free product they'll feel guilty if they write anything too bad. Even if they do end up not liking the product they will likely say that it is just their opinion and they could see how someone else would like it, and in the end there is no such thing as bad publicity.

Forum & Blog Participation

An easy way to gain incoming links, though those of questionable value, is to participate at related forums and blogs. Almost all blogs and most forums allow posters or commenters to leave a website link or have signature links and you can gain direct traffic through these links in addition to gaining link popularity.

The key though is to participate in a real fashion. There are software packages and services out there that will automatically post rubbish at forums and blogs to get you links; this is spam. There are also people out there who will do such posting manually but only achieve a level of quality a sliver above the posting robots, and that's still spam. You need to participate, not just post. You cannot simply say, "Yes, I agree" and hope to not be perceived as spam.

The best advice I can give in this regard is to try not to post anything you wouldn't post if you weren't trying to get incoming links. The idea of gaining links from forum signatures and blog comments should be more about getting some links & traffic for an activity you'd do anyways, rather than an activity you're only doing for the links. Blog & forum owners tend to be savvy to this sort of behavior and may end up banning you for it if you're not being genuine.

Also, remember, the more helpful you are the more people will respect you and the greater exposure your links will get.

Using Yahoo Answers

Yahoo Answers is a peer-to-peer social media question answering service. Users can ask and answer questions and answers are then voted on to rank which one is best. They have warnings all over the site about using it for promotional purposes, but so long as you're actually being helpful you can get by with some promotion. The best part is other search engines index these questions and answers, so you can get on topic incoming links in addition to the direct traffic.

When answering a question, be helpful, include real information, and then include a link to your site in the "Source" area (assuming your website is a source for the information). Users also often post questions asking for links to sites, and if your site fits the bill you can simply post a link to your site as an answer.

I can usually spend a half hour answering questions and end up with 10 links, and many of my answers will even be voted as "Best."

The thing is though, just as with blog & forum participation, you do not want to force it. If you cannot find a question that your site's content can answer, don't fudge it and post a link on an only slightly related question; just try again tomorrow.


Another thing you can do, which is decidedly gray hat but is hard to catch, would be to do an exchange with a friend. Have a friend post a question tailored to your site, and you provide the answer. Then you post a question tailored to his site, and he can answer. Or, if your site is a true authority for a topic, you can simply post a question tailored so well that a user you don't even know could end up posting a link. You should still follow all my other advice about making it look genuine and posting real helpful information of course.

Using Press Releases

A press release is a short informative notice sent to members of the press about your business or a new product or service you are offering. Services like PRWeb have made it very easy to create such a press release, vastly lowering the barrier of entry to this type of promotion for small businesses.

You shouldn't have any dreams of being contacted by all these media sources for interviews after sending out your press release. That is a long shot; most press releases are ignored simply because so many are sent out on a daily basis.

However, there is always the slim chance that you could hit just the right nerve with an editor to get a media mention. This can be easier if you live in a small or rural area. Knowing of an Internet business in their local area can give small rural newspapers or other news outlets a go-to guy for Internet related news stories. You could also pique their interest for one of those fluff pieces about successful entrepreneurs.

However, when you release a press release through PRWeb it will always gain you incoming links, because PRWeb itself will list the release in their directory and places they syndicate to will as well. It may not be a huge number of incoming links, but they will be incoming links.

So, when doing a press release, it's okay to hope for some media mentions, but realistically you should only expect the links from one or two places just listing the press release.

Writing Articles

Writing articles for sites that allow you to have a signature link at the end is a good way to gain incoming links, however it isn't for all sites and like most other link building methodologies there is a darker side to it.

With almost all Internet marketing techniques there is the easy way and the hard way, and the hard way is almost always better. In this case many article directories and article submission services have popped up to cater to the lazy out there who do not want to spend the time looking for real places to submit articles. These article republication sites offer almost no value: no traffic, no link weight, nothing.

When you submit to these sites your article will be republished in countless other worthless places and in the end any signature link you have will end up looking more like spam than anything else.

Yet, content site owners need content, and article writers want good incoming links, so there must be a way to bring them together, right? Yes, but it involves work on both sides.

Good content sites get that way by not publishing every horrible 500-word fluff article that barely scratches the surface of useful information that comes their way. Good content sites get that way by not publishing content that is published in a dozen other places. Good content sites get that way by exercising editorial control over what they publish and by seeking out unique content.

So, there are links out there to be had on good content sites, but you need to be willing to put in work for them. That means writing a useful, unique article for publication. You may even be able to negotiate payment, though most webmasters understand the value of a signature link from a good content site and will take the value of the link into consideration in any negotiation for payment.


Another thing you do not want to do is write an article that reads like a press release for your business; if you want to write a press release, see the previous section. Webmasters who exercise real editorial control are savvy to such tactics and will likely reject your submission.

So, you see, with article writing you get what you pay for, or rather what you put the work in for. If you spend no time or effort in writing an article you can get it published on generic article directory sites or poorly ranked content sites that no one uses or considers an authority, the links from these sources will be next to worthless. Or, if you put the work into writing a good article you can get it published on a good authority content site that probably has a good deal more link weight & traffic to pass your way.

The best thing you can do, especially if your site is complementary to a content site rather than a direct competitor (such as an ecommerce site), is work out a regular contribution deal where you provide one good article every month or two to a content site webmaster. A reliable regular contributor is valuable enough that the webmaster may allow you a little more freedom in promoting your business.

Using Referral Based Affiliate Tracking

Most affiliate programs or other types of scripts (such as top sites lists, or anything that ranks sites based on the traffic they send you) rely on tracking code appended to the URL. This makes it unlikely that the link will provide you any link popularity, as the URL is so different than URLs normally used to access your content.

However, with some server side programming that checks the HTTP_REFERER variable, you can instead allow people to just link to your site normally and get credit for those links. It's true, this type of tracking would not work for email marketing, for people whose software blocks this field, or for RSS feed based promotion, but on the other hand it would work well for anyone who runs a site with user submitted content. If you ran a popular forum and were a member of an affiliate program that did this, you could get credit for every link your users make on your forum to your site, without any tracking codes needing to be added to the links.
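A minimal sketch of this referrer-based crediting, in Python (the domain-to-affiliate table and the function name are hypothetical; a real implementation would read the Referer header from the incoming request):

```python
from urllib.parse import urlparse

# Hypothetical table mapping approved referring domains to affiliate IDs.
APPROVED_AFFILIATES = {
    "forum.example.com": "aff-101",
    "blog.example.org": "aff-202",
}

def credit_affiliate(referer_header):
    """Return the affiliate ID to credit for a visit, or None.

    Instead of requiring tracking codes in the URL, we inspect the
    visitor's Referer header and credit the affiliate whose approved
    domain sent the visitor.
    """
    if not referer_header:
        return None  # direct visit, email client, or stripped header
    domain = urlparse(referer_header).netloc.lower()
    return APPROVED_AFFILIATES.get(domain)
```

Because the links themselves carry no tracking codes, they look like ordinary editorial links and pass their full link weight.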

The biggest benefit though is to the site doing the tracking. Regardless of what type of site you have, having such links devoid of tracking codes will make them look more legitimate and will make sure they provide the full link weight and link context bonus.

For a site with an affiliate program, using this is straightforward: you just change your tracking. However, it isn't just sites with affiliate programs that can do this; sites with traffic-ranked top sites lists can do it, and so can directories. Right now many directories require reciprocal links to be listed, which makes them look spammy. Instead of requiring reciprocal links for a listing in your directory, provide this sort of referral system for listing enhancements.

Still, the largest benefit will be to those sites with affiliate programs. A site with a popular affiliate program can have thousands and thousands of incoming links that aren't being counted because of complicated tracking codes & redirects. Doing this would instantly give you a huge number of real incoming links instead. Additionally it may help control publisher abuse, as they'd only get credit for promoting your site on domains you have approved, and they wouldn't be able to use email spam to promote your business.

For more on this topic please see this blog post.

Building a Good Site

I saved the best for last so that you'll take it with you when you're done reading this article. I also realize that I touched on this at the beginning in my talks about link baiting, but it is so important that I need to touch on it again.

Gaining incoming links can be easy, or it can be hard; it depends entirely on whether your site is good. If you put no work into producing a quality site and instead merely plan to focus on promotion, you will not have much success. You cannot promote a lemon. If your site is not finished, or doesn't have enough content, or otherwise has a quality problem, hold off on promotion until it is fixed.


Every single method I've mentioned in this article will be easier, or more fruitful, if you have a high quality website with good unique content. You will also gain more links through no effort on your part if you have a good site as people will naturally link to and recommend your site.

I've provided a fairly comprehensive list of things to try when working on building your link popularity, however if there is one thing I want you to take away from this article, it is that the most important thing is building a quality site. Make a site that people will love, and they will love you for it.

For more on building incoming links, or questions about anything in this article, please see our forums.

On-Page Search Engine Optimization Techniques

On-page SEO is anything that you do to your site files themselves in order to rank better in the search engines. This differs from off-page SEO, which is primarily concerned with gaining links to your site from other sites.

Neither type of SEO is more important than the other. You need both on-page and off-page SEO to rank well; the best on-page SEO isn't going to get you anywhere without good incoming links, and the best incoming links in the world can't help you if your site is crawler poison. However, there is an upper limit you can reach with your on-page optimization, after which your time will be better spent on off-page optimization.

What follows will be an overview of fundamental on-page search engine optimization techniques.

Your Title Tag

The single most influential piece of real estate on your site is your title tag. Reaching back to the earlier article on choosing a domain, by using keywords in your site name you can easily ensure that they will appear in your domain, in many of your incoming links, and in your title tag, all of which is extremely important.

If you did not name your site with your keywords, you will need to figure out what your prime keywords are and use them in your title tag as well, such as in the slogan most often placed after your site name.

As an example, if you ran a web development site called WD City you would want your title to read "WD City - Web Development Resources" or something similar. Naming your site with your keywords is best, but if you did not you must make absolute certain the keywords make it into your title tag. Failing to put your prime keywords in your title tag will significantly hinder your efforts to rank with that page for those keywords.

Further into your site I recommend a most-specific to least-specific approach. So you will want to write your title tags along the lines of "Page Title - Section Title - Site Title," optionally leaving out the section title if you want to keep it shorter or if it isn't applicable. Again, naming your pages or articles with your keywords makes it easier for you to include those keywords in your title tag, which in turn helps you rank for those keywords a great deal. Often people will want to name their articles using clever metaphors like those you find in many offline magazines, but for SEO purposes you want to be very specific, literal, and keyword rich. So, in the realm of SEO, if you had an article about taking better landscape photographs, you would want to title it "How to Take Better Landscape Photographs" and not something "clever" but ambiguous like "Take Better Pics of the Hills & Sticks." Being literal in your titles and topics will help your search rankings flourish.
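The most-specific-first pattern above can be expressed as a tiny helper, sketched here in Python (the function name and example titles are my own):

```python
def page_title(page, site, section=None):
    """Build a title tag string, most specific part first:
    "Page Title - Section Title - Site Title", with the section
    omitted when it isn't applicable."""
    parts = [page, section, site] if section else [page, site]
    return " - ".join(parts)
```

For example, `page_title("How to Take Better Landscape Photographs", "WD City")` produces "How to Take Better Landscape Photographs - WD City".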

Title tag space is not infinite: search engines may stop reading it after a certain length, and regardless of that, for each word within it all the other words are weighted less. Always think in percentages when doing this type of SEO. If your main keywords are 50% of your title tag, that is good. If they're merely 5% of the tag because you've stuffed it with many minor keywords and other unnecessary language, that isn't good. There is no magic keyword density for any aspect of on-page SEO, so worrying about hitting an exact percentage is a fool's errand. However, you do want to be aware of your density, especially in high value areas like your title tag. You will want to weigh adding additional minor keywords against the effect they will have on the density of your main keywords.


The final issue you must remember when optimizing your title tag is that in addition to being the highest weighted bit of on-page search engine optimization, your title tag also forms the link people will be clicking on to reach your site from the search engines. As such you want to make sure it entices people to visit by avoiding the appearance of being spammy. This is also why we list the most specific information first in the title because that is likely what the person was searching for.

Meta Tags

Meta tags have a somewhat controversial past. In the 90s they were a main criterion for search engine rankings, but because they were so easily abused by webmasters they were all but dropped from use. For years many refused to believe this, and the SEO world was divided into two camps: those who believed meta tags were still a major factor, and those who didn't. Now, though, I think mostly everyone has finally come around to accepting that meta tags are one of the least important aspects of optimizing your site.

However, being hardly important does not mean they aren't worth doing. You merely do not want to overanalyze them or waste your time worrying about them.

The first meta tag is the META DESCRIPTION tag formed as follows:
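A minimal example of the tag's standard form (the description text is illustrative):

```html
<meta name="description" content="WD City - tutorials, articles, and resources for web developers.">
```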

This tag is actually still somewhat important, as a search engine may use it to form an abstract for your site's SERP (search engine result page) listing if they cannot generate an on-the-fly abstract from your visible content. So primarily you want this meta tag to accurately describe your site in a way that will encourage clicks from the search engines. Include your keywords of course, but your focus is more on clicks with this tag. Also remember that just because a search engine uses it for an abstract doesn't mean it provides any actual ranking benefit.

The second meta tag is the META KEYWORDS tag formed as follows:
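A minimal example of the tag's standard form (the keyword list is illustrative):

```html
<meta name="keywords" content="web development, html, css, javascript tutorials, webmaster resources">
```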

This tag is used for providing a list of keyword suggestions to search engines with which they can rank your page, except most search engines ignore this tag nowadays. Still, you can easily make one in a few minutes, so it is recommended you do so. Simply create a comma-delimited list of keywords and keyphrases and include them in the tag. Do not bother with repeating words; do not fret or stress about doing it well. Just get it done. The one piece of advice I have is to try to include less common words that your site is about but that might not make it into your visible content with any regularity.

Creating Search Engine Accessible HTML

The markup on your site should be clean, clear, and as limited as possible. Pull open the source code of a page on your site and check out how it looks; if it is a mess of HTML code you might want to trim it down or switch to lighter markup, such as a CSS-based layout. Again, here we worry about density, and extraneous code could potentially hurt your keyword density.

Many believe you must have completely validated code as well or the search engines will penalize you. This is not true. The search engines are not markup grammar police; they care not if your site validates strict. However, clean, validated, accessible, cross browser compatible, quickly loading markup is a very good thing because it makes your site more usable to more people and provides them with a better experience when using your site. This will cause people to enjoy your site more and make them more likely to link to your site or recommend it to others. So yes, there is value in having good clean CSS markup, as there is value in having a fast server to provide quick page loads.

So, throughout your HTML markup, remember: less is more. If you can do a pure CSS design, do it, leave naught but content markup in the page source itself, and move all style definitions to an external CSS file. If you use JavaScript, also keep it in external files. This will not only speed up page loads through the use of browser caching, but also of course keep the keyword density in your source code as high as it can possibly be.
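A sketch of what this separation looks like in the document head (file names are illustrative):

```html
<head>
  <title>WD City - Web Development Resources</title>
  <!-- Style and behavior live in external, cacheable files -->
  <link rel="stylesheet" type="text/css" href="/style.css">
  <script type="text/javascript" src="/scripts.js"></script>
</head>
```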


How Spiders Read

Spiders read your source code from top to bottom, and content placed at the top tends to be given more credence than content at the bottom. Additionally, some crawlers may stop reading pages after so many kilobytes. So you really need to prioritize your content.

Most websites have a top horizontal menu, a left menu, a right menu, or all of the above, and depending on how you structure your site, be it with HTML tables or with CSS, some of that content can end up at the top of your source code and some of it at the bottom. You will want to make sure what is at the top is what you want to be given the most notice.

Spam Warning

There was an old technique of filling up HTML comments (which are invisible on the rendered page) with keywords to rank better in search engines. Don't do this; if the search engines don't just block the keywords, they'll penalize or ban your site. This is no different from putting in text the same color as the background, so-called invisible text, to increase keyword density. Any technique that tries to add keywords to your page that users do not see, apart from the use of meta tags, is spam. Using any such technique risks penalties & bans.

What I recommend is making sure your menu is at the top. The simple reason is that no menu is likely to ever be long enough to prevent your content from at least being partially spidered, but content can easily be long enough to prevent your menu from being crawled if the content is placed first in the source code. If your menu is not crawled, a search engine will not be able to crawl your site as deeply and you will end up with fewer pages indexed. As such, it is better to have 100% of your menu crawled even if it means that only 90%, 80%, or even 50% of your content on one page is indexed.

ALT & TITLE Attributes

Two key accessibility features that are part of the HTML specification are ALT attributes for images and TITLE attributes for links. The ALT attribute is meant to provide a description of an image for the blind or for those with images turned off in their browser. The TITLE attribute is meant to provide a description of the linked-to page, giving the visitor more information before they click the link.

Both locations are prime places for the placement of keywords. However, this does not mean that you want to stuff them with keywords, stuffing being the practice of repeating keywords over and over in a nonsensical way. Instead, you want to use them for their intended purpose and simply be very literal in your descriptions.

For a TITLE attribute, a poor example would be "Click for more information." That is completely generic, does not provide good information to the visitor, and helps you not at all with search engines. A far better approach would be "Information on Panoramic Photography," in which you eliminate unnecessary noise words (click for more) and add the literal prime keywords of the linked-to page (panoramic photography).

For an ALT attribute, most often used with your site's logo, a poor example would be "Our Logo." Instead it would be best to more or less repeat what is in your <title> tags. For another image on the page, let's say within an article on panoramic photography, a bad example would be "A picture of a camera" whereas a good example would be "Panoramic Photography Camera with Wide Angle Lens," again because I have removed the noise words (a picture of) and added a more literal and complete description.

Neither the TITLE nor the ALT attribute will provide a large benefit to your rankings, although the ALT attribute can make a noticeable difference in image search result sets. However, they are relatively easy & straightforward to implement and you should use them for accessibility reasons alone.
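To make this concrete, here are the poor versions and the better versions side by side (the file names are hypothetical):

```html
<!-- Poor: generic, keyword-free attributes -->
<a href="panoramic.html" title="Click for more information">more info</a>
<img src="camera.jpg" alt="A picture of a camera">

<!-- Better: literal descriptions that carry the page's prime keywords -->
<a href="panoramic.html" title="Information on Panoramic Photography">Panoramic Photography</a>
<img src="camera.jpg" alt="Panoramic Photography Camera with Wide Angle Lens">
```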


FRAMES

HTML frames were once a really good technology for controlling page downloads. When people were connecting at 9600 baud it was a pain to have to reload the header, menu, or footer on every page load. Frames fixed this problem. However, since then bandwidth has increased, and there are better technologies that accomplish the same thing. Server side includes & other server side scripting platforms provide the same maintenance benefits to the webmaster, and things such as external CSS files provide the same bandwidth savings to the browser.

However, frames remain poorly accessible. Search engines still have problems crawling them, and users have trouble with page reloads & with bookmarking.

As such I can safely say that you shouldn't use frames. The only time I would entertain their use is in a web based application that you must log into first to access. In those cases neither bookmarking nor SE crawling is desired and the use of frames can provide some programming benefit.

The exception to this is iframes. Iframes, or inline frames, are a good technology for including content from another server within your site. The reason is that the iframe prevents delays on the other server from slowing down your page load. So by all means, use an iframe to serve advertisements or other extraneous content that you do not want slowing down how fast your actual content appears on the screen. You don't much want SEs to crawl that content anyway.

You can also use iframes to block search engines from crawling sections of a page. For instance if you had something like a tag cloud that you worried would look like spam to a search engine you could serve it in an iframe from a file that is blocked by your robots.txt file. In this way a search engine will ignore that part of your page.
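A minimal sketch of that trick, assuming a made-up /noindex/ directory, would be:

```html
<!-- robots.txt on this site is assumed to contain:
       User-agent: *
       Disallow: /noindex/
     so spiders never fetch the framed file, and the tag cloud
     is invisible to them while still rendering for visitors. -->
<iframe src="/noindex/tag-cloud.html" width="200" height="400" frameborder="0">
</iframe>
```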

Javascript & DHTML

As the capabilities of HTML & CSS have increased, the use of Javascript & DHTML has decreased to an extent, and that is a good thing. However, you still need to be aware of issues with these technologies.

Search engines do not read javascript, and so yes, you could block content from being crawled by printing it with javascript, much like I mentioned with the use of iframes. The problem with javascript, though, comes when people use it for content they do want crawled.

For instance, many people like to use fancy interactive javascript & DHTML driven menus. These can sometimes be okay, but most often they are not. If the menu simply uses javascript to move around standard HTML anchor tags, then a search engine can usually still appropriately crawl it. However, many such menus rely on javascript's onclick event handler instead of HTML anchor tags, and search engines cannot process such things. In those cases your menu will end up being ignored by the search engine, which will make it difficult for it to crawl the rest of your site.

Quick Tip

If you want a fancy graphical menu, try using a method to fake it so you still get the textual benefit. More on that here: Faking Graphical Links.

The only other issue with javascript is one I already mentioned: keep it in external files that you merely include into the HTML document. Doing this keeps the source code to a minimum, helps speed up page loads, and helps keyword density.

If you do use a DHTML menu you should include a proper text one either in your footer or in <noscript> tags.
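The difference can be sketched as follows (page names are hypothetical). The onclick version hides the URL inside script, while the anchor version leaves a link a spider can parse, and the <noscript> block provides the text fallback:

```html
<!-- Uncrawlable: the destination exists only inside a javascript handler -->
<div onclick="window.location='articles.html'">Articles</div>

<!-- Crawlable: a real anchor tag, even if a script moves or styles it -->
<a href="articles.html">Articles</a>

<!-- Text fallback for a script-driven menu -->
<noscript>
  <a href="articles.html">Articles</a> |
  <a href="reviews.html">Reviews</a>
</noscript>
```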

Proper Use of Flash

Like javascript, Flash is another technology that search engines cannot crawl. Strides have been made, and there now exists a way for search engines to crawl links placed within Flash files, but not all of them do, and the content, which tends to be mostly graphic based, still does not get crawled.


As such I recommend not using Flash for your website, and especially not using one of those Flash splash screens as a welcome to your site. Your site's homepage is the highest weighted part of your site in most cases, and if all that is there is some Flash welcome movie then you have almost no spiderable content, and ranking for any keyword will be extremely difficult.

If you must use Flash, the proper & accessible way to use it is as an optional viewing platform. Have the default site be HTML & CSS based, and then either detect Flash with javascript and open a new window for it to play in, which the visitor can then close if they wish, or provide them with a button to launch your Flash version. Do not force it on them, and do not make it the default for viewing your site, or you'll be feeding the search engines poison.

Of course Flash is a great technology if you're using it to display content you don't need the search engines to index, such as games, advertisements, or interactive tools.

FORMS

Forms are necessary for allowing browsers to interact with web pages, but like with other technologies search engines do not fully support them. Specifically a search engine will never submit a form on your site. So, if the only way to reach your content is through submitting a form, a search engine will not reach that content.

This can be used to your advantage. If you have a link you do not want a search engine to follow, making the navigation form based and then dressing up the form submit "button" to look like a link with CSS is certainly an option.
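As a sketch, assuming a hypothetical members.php page you would rather spiders not reach:

```html
<style type="text/css">
  /* strip the button chrome so the submit reads like a plain text link */
  .linklike { border: none; background: none; padding: 0;
              color: #0000cc; text-decoration: underline; cursor: pointer; }
</style>

<!-- Spiders never submit forms, so this navigation is invisible to them -->
<form method="post" action="members.php">
  <input type="submit" value="Member Area" class="linklike">
</form>
```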

If there is content you want a search engine to crawl but users typically access through a form, you will want to provide text based navigation to reach that same content. For instance many large information sites with huge databases of content or statistics typically serve all that content to visitors via a form based internal search engine. However, to get search engines to crawl every bit of content as well, it is necessary to build a method of browsing through the entire library using simple text links.

Heading Tags, Fonts, & Sizes

Heading tags, specifically <h1>, can be used to denote important content on your site, however just like with your title tag you must think in percentages.

For instance, you could use CSS to make your H1 tag look like normal text and then use it for each paragraph of content. However, that would be less beneficial than just putting a few choice words in H1 tags, since when you put an entire paragraph in them the keyword density for any one word is going to be abysmally low.

So do not try to do anything tricky like that; the search engines are well ahead of you on that point and it will not work. Instead, use the H tags as they were meant to be used.

Use H1 for the on-page title (such as the article title at the top of this page). Use H2 for secondary content sections, H3 for tertiary ones, and so on down the line just as you would if you were making an outline.

You can of course use CSS to change the appearance of your heading tags, to make the fonts smaller or larger or a different color; this will not affect your rankings. The content doesn't have to be in a big font, just appropriately used as a header and with good keyword density.
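Used as an outline, the heading structure of a hypothetical article page might look like:

```html
<h1>Panoramic Photography Guide</h1>
  <p>Introductory content for the page...</p>
<h2>Choosing a Camera</h2>
  <h3>Wide Angle Lenses</h3>
  <h3>Tripod Heads</h3>
<h2>Stitching Your Photos</h2>
```

Each level narrows the topic, just as it would in a written outline, and the few words inside each tag are choice keywords rather than whole paragraphs.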

In years past search engines also gave higher weight to content found in <b> or <strong> tags or other tags that indicated emphasis. It is hard to say if they still do, since it is nearly impossible to scientifically test, however even if they still do this any such bonus would be small and not worth worrying about. So again, use the tags as they were intended and do not think to try to eke out a benefit by overuse or through manipulating them to appear like normal text when they're not.


Keywords & Writing Effective Content

One of the biggest factors in your search engine ranking is going to be your keywords, or what words appear on your pages. Choosing the right keywords is vitally important, but we've already covered that; this section will instead deal with how to include those keywords in your writing. While search engines once relied on keywords found in your meta tags to rank your page, they now have more sophisticated software that analyzes the text on your page and not only checks for keywords or phrases, but also checks that they are in a proper form and not just repeated over and over again.

Keywords have always been an integral part of determining your search engine rank. In the early days the search engines would simply count how many times a word was used on each page, and whichever page used it the most would get the top listing. This resulted in people repeating words thousands of times on their pages, usually with a text color the same as their background to make the words invisible. Now, not only do search engines ignore, or even ban, sites that repeat keywords, they will also do it to sites that have text the same color as the background.

Keyword volume, or density, is still important though, and so you must try to include both your main keywords (the ones you most hope to rank well on) prominently, as well as as many minor keywords as you can.

For instance consider the following passage:

Welcome to Ray's Photography Resource. Photography is a very complicated skill and you don't always know or remember all of the techniques you need to produce a better product. Or maybe you have trouble keeping up on industry news and recently developed new techniques. If so you can find the answers you seek here. We have articles on many different topics from this industry, and we add new ones every week. We can help you find the right equipment with our extensive reviews section. Finally we can offer you a place to display your work for others to see. Whether you're a professional or amateur we have what you need.

On the surface it sounds fairly typical but from a keyword perspective it is horrible. The only place I even mention words related to the topic of the site is in the first two sentences, the rest of it could have been written about anything.

Now consider the following passage and notice the difference:

Welcome to Ray's Photography Resource. Photography is a very complicated skill and you don't always know or remember all of the lighting or developing techniques you need to produce better photos. Or maybe you have trouble keeping up on photography industry news and recently developed photo editing processes. If so you can find the answers you seek to your photography related questions here. We have articles covering all facets of photography, from lighting, to film, to lenses, to editing, and we add new ones every week. We can help you find the right camera, lights, film, or editing software with our extensive reviews section. Finally we can offer you a place to display your finished photos for others to see. Whether you're a professional photographer or an amateur we have what you need.

The second passage has significantly more keywords and more repetitions of those keywords. What you need to do is model your content on the second passage. Try to fit as many keywords, as many times as possible, without resorting to simple repetition. People will hopefully use more than one word or phrase to find your site, so make sure you get some variety in there, just so long as it relates to your topic.

Squeezing in Minor Keywords

It is really difficult to rank a page for a keyword that does not appear on the page. Often, having a keyword mentioned just once on a page can make the difference between ranking in the 100s and ranking in the top 10. The concept of the long tail that I previously talked about is applicable here as well. For many sites there can be hundreds, or thousands, of obscure search terms they could rank well for if they just had a few minor keywords on more pages of their site.


One easy way to make sure you always get your major and minor keywords on your site is to use a boilerplate sentence or two as a site description. Typically you can find these in the footer of a site along with a copyright statement, such as the following:

Site Copyright Acme Inc. All Rights Reserved. Acme Inc is a provider of shipping supplies, commercial mailers, packaging materials, overnight services, and more in East Lansing, Michigan. We proudly serve Ingham, Clinton, & Jackson counties.

You don't want to make it too long as Google reps have stated that long boilerplates can end up being negated as duplicate content. However a simple one or two sentences can serve to get a dozen or so prime keywords onto every page of your website. Don't worry about keywords found in your site name or other words that already end up on every page. Instead focus on a selection of minor keywords that do not always get mentioned. If you need help picking them, go back and do some keyword research.

Wrap-up & Further Reading

On-page optimization is, in a way, less important than off-page optimization, or at least less time consuming in the long run. The reason is that your on-page optimization can only be so good, whereas there is no upper limit on the number of quality incoming links you can find. So follow the tips presented in this and my other articles; however, if you've done it all and still are not making progress, start to focus on the off-page factors instead of overly stressing yourself out over your on-page optimizations.

The one on-page topic I did not heavily cover in this article is menu optimization, internal site linking, and site architecture. This very important topic is deserving of its own article, which can be found here: Site Architecture: Optimizing your Internal Links

Below you will find a checklist summarizing the main points of this article; this will be a good place to refer back to as you work on your site. Also, as always, if you need any help with anything mentioned here, please ask in our webmaster help forums.

On-Page Optimization Checklist

1. Descriptive literal title tag that includes your keywords.
2. Descriptive literal meta description tag that includes your keywords and entices clicks from readers.
3. Meta keywords tag that includes your main keywords, minor keywords, misspellings, and words you may wish to target but that are not commonly found in your content. Don't stress out too much when making it.
4. Clean, coherent HTML or CSS markup with as little markup coding as possible, with externally included files for CSS definitions. Fast loading, accessible, and compatible.
5. Source code formatted so your most important menu items are read first.
6. Descriptive literal ALT & TITLE attributes for images & anchor tags (links) that include your keywords.
7. No use of frames for public content you want search engines to see.
8. Conservative use of javascript; do not use it to display important content.
9. Conservative use of Flash, as an option for viewing your site, never a requirement.
10. Normal HTML based navigation alternatives if form based navigation must be used.
11. Descriptive literal heading tags (h1, etc.) that include your keywords with good density and are used properly to sectionalize content.
12. Effectively keyword rich content that is descriptive, literal, and goes into great detail.
13. A fully optimized and accessible menu.


Site Architecture: Optimizing your Internal Links

The structure of your site has a lot to do with how well you'll rank in the search engines. Now I'm not talking about HTML or any actual code, I'm talking about how your site is held together by its internal links. The navigation schemes you employ can drastically affect your ranking one way or the other and it is something that you absolutely must pay attention to if you want to make it to the top of the search engines.

Building a Useful Menu

The most important part of your site, more important than your actual content, is your menu. Yes, your menu is more important than your content. Why is this? Well without a usable menu a search engine may never find your content.

In order to keep your menu search engine friendly, that is, so that a search engine can follow the links presented, it is important that you use normal HTML anchor tags to make the links. This means that fancy DHTML menus that use javascript onclick event handlers are not a good idea, since they are not read by search engines. However, if your DHTML menu simply moves normal HTML anchor tags around then you're fine, since a search engine can still parse those links out of your source code. Also, Flash is not yet read by all major search engines, so you should avoid using a Flash based menu.

If you absolutely must use a menu that search engines cannot read, you should include a duplicate plain text menu somewhere on your page, or in <noscript> tags.

In addition to using normal HTML anchor tags you should put text, not images, in those tags. The reason is that the anchor text used to link to pages is a vital part of the ranking algorithm of many search engines, notably Google. So, if one of the pages on your site linked to from your menu is about "Golf," then linking to that page with the word "Golf" will fetch a far greater benefit than linking to that page with a picture of a golf ball.

Most menus also include a link to your home page with words like "Home" or "Index." If you're trying to optimize your index page this is a bad idea; you'll be much better off linking to your home page using text like "Keyword Home." The benefit from this one small change can be substantial.

So, building a search engine friendly menu is paramount to search engine success and populating that menu with keyword rich text links, including the link back to your homepage, can make a substantial difference in your ranking.

Building Useful URLs

How you construct your URLs can also make a large difference in your site's success. Specifically, some URLs may prevent your site from even being indexed, so it is important that you implement search engine friendly URLs. Since this has been covered in depth in the article linked to, it will not be repeated here.

Additionally, you will want to create effective URLs with meaningful identifiers and delimited keywords; that topic is covered in this article.

Sitemaps

If your site is larger than 10-20 pages you may also want to consider creating a sitemap that links to the major sections of your site. This is not really a tool to increase your ranking; what it does do is increase the likelihood of a search engine spider finding all the pages on your site. It increases the overall interconnectedness of your site and also helps spread around, but not increase, PageRank, as discussed below.

Alternatively you can help the search engines find all your pages by creating a Sitemaps Protocol XML file, more on that can be found here.
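For reference, a minimal Sitemaps Protocol file (typically sitemap.xml) follows the sitemaps.org 0.9 schema; only the <loc> element is required for each URL, and example.com is of course a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/</loc>
  </url>
</urlset>
```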


PageRank and your Site's Structure

PageRank (PR) is a major portion of Google's algorithm and similar systems are used by other major search engines. This article will not present an introduction to PageRank, but rather it will discuss how to optimize your site to make the most of what you have. For an introduction to PR please see our article: Link Popularity and PageRank.

There are two sides to taking advantage of link popularity and PageRank. The most obvious avenue people pursue to increase their scores in these two areas is to seek links from other websites. However, equally as important is optimizing your own site to ensure that whatever PR you do get from other sites doesn't go to waste. In fact, due to the nature of how most sites are laid out, almost half of all the weight your home page has could be from other pages on your site. Those other pages on your site, your articles, your subsections, your product pages if you have an ecommerce site, will in many cases get almost all their PR from internal site links. Thus, your internal site structure can make or break your search engine success.

Internal Linking Fundamentals

There are two issues you need to be concerned with in regards to your internal site links. The first, and most evident issue, is that internal site links are conduits for PR to travel to your sub-pages from your index page and vice versa. The second issue is that the benefit that any single page receives from an incoming link is highly dependent on the context of that link - specifically the anchor text (text used within the link itself) for the link. The issue of context has been addressed above in the discussion on building a useful menu so now I will cover the first issue.

Moving Your Weight Around

Do you want to focus more of your weight on your home page, or on your sub-pages? Is there a certain page you want to emphasize? How do you want people to find your site in the search engines? The answers to these questions should shape how you lay out your site's linking structure.

I feel that this topic is best explained via examples and so I will now present a series of examples of different site structures and what they mean for you.

The first example is a pyramid, or hierarchical, linking structure. The index page links to sub-sections, which then link to their own sub-sections, and so on down the line. Each individual page on the site links to its parent page and its child pages, in addition to the index page. You'll find this linking structure on directory sites like DMOZ.org.

Since every page of the site in this example links to the index page, most of the PR is focused on the index page, and the deeper you go into the linking structure the less PR there is. This type of structure is appropriate for sites that are optimizing their home page for some keywords, or larger sites that are unable to pull off other linking schemes. This same linking scheme can be applied to sections of a larger site if you want to concentrate PR on a specific page. If you use this linking scheme and put keywords in the text linking back to your homepage, or the top of your pyramid if you're applying this scheme to only a section of your site, the benefit is enormous.


The other main type of linking is mesh or web linking. In this type of structure every page of the site links to every other page, as shown in the diagram.

This type of linking structure is best in instances where you want to spread PageRank around your site more evenly, and it is most appropriate for sites with a smaller number of pages (so that all pages can have a link on the menu). This type of structure is also good for a site that gives a little information on a wide variety of topics. For instance, if you had a site about cars with a page or two of information on every major make and model, you wouldn't really want to focus PR on your homepage, since for a generic car search you don't have much hope of coming up first. But by using a mesh structure you distribute your PR evenly among all of the various car models, instead of focusing it on your homepage, so your sub-pages have a better chance of getting ranked well.

Hybrid linking structures of both of these models are possible as well. You might use an overall hierarchical structure to govern your various categories while using a mesh structure within each category. In general just remember that the more internal links you have the more your PageRank will be spread out.

Cross-linking Two Sites

When dealing with two sites you want to cross promote, there are certain linking structures that will give you the most bang for your buck. Two sites that are fully meshed will result in the higher PR site having its PR lowered and the lower PR site having its PR raised. So unless you want to sacrifice some of the first site's PR to help the second site, you shouldn't fully mesh two sites.

If you're purposely trying to siphon off PR from one site to help another, there are better schemes than fully-meshed cross-linking to consider. If the site that needs the PR doesn't link back to the one that is sending it, the overall benefit for the site that needs the PR will be larger, as shown in the diagram at left.

By sending its received PR to its sub-pages and then back to its homepage, Site B effectively almost doubles the PR that its homepage gets from the Site A link. This same PR structure can be used to launder PageRank.

For more on cross linking two sites, see this article on the Hub & Spoke method.


Laundering PageRank

Laundering PageRank refers to the process of converting the weight gained from off-topic sites, or from links without keyword-rich anchor text, into weight that carries the context of your keywords. It is a very simple process, and if you've followed the advice presented thus far you're already doing it. However, I want to explain this concept because I feel it does a good job of illustrating how PageRank works.

Pretend you're selling widgets but most people don't link to you using the word "widgets." The end result is that while you're getting PageRank from your incoming links, it isn't helping you a whole lot in the search engines. So what you do is link to your sub-pages and then back to your homepage, only this time form the link with the word "widgets." The end result is that the context-less weight you're getting from third parties is sent to your sub-pages and then back to your homepage as context-rich weight, which will help significantly in the search engines. So really, what you're doing when you link to your homepage with a word other than "Home" or "Index" is laundering PageRank. Now, not all of the PageRank sent to your sub-pages gets sent back. There is something called the dampening factor which removes some PR from the equation during every "jump," so just like your normal laundry you might find a couple of socks missing after laundering your PageRank. However, the benefit still makes it worthwhile if you need to optimize your homepage for a competitive key phrase.
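For readers who want the arithmetic behind those missing socks, the original PageRank formula from Brin & Page's paper, with the damping factor d conventionally set to 0.85, is:

```latex
PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}
```

Here T_1 through T_n are the pages linking to page A, and C(T_i) is the number of outbound links on page T_i. Because each hop multiplies the passed weight by d (and divides it among that page's outbound links), weight making the round trip from your homepage to a sub-page and back is scaled by roughly d squared, or about 0.72 before the link-count split, which is exactly the portion the dampening factor keeps for itself.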

Other Resources

I don't often link to articles written by others, as I'd rather link to ones written by myself, but this article is too useful to pass up:

PageRank Explained

This paper offers some great examples, and the math right along with them, that illustrate how different linking structures affect PageRank.

If you need help with anything you have read in this guide, please feel free to ask at our webmaster forums (http://www.websitepublisher.net/).

Supplemental Reading

1. Search Engine Friendly URLs - how to make your dynamic pages look static.
2. The Three Types of SEO - how SEO changes depending on the type of page.
3. Hub & Spoke - a tried & true method of cross linking multiple websites.
4. White, Gray, Black - explanation of the SEO "hat" jargon, and more on why I like white.
5. Help! My Website is Gone from Google - what to do when things go wrong.

Blog Essays

1. Is SEO Science or Marketing?
2. PageRank: An Essay
3. Doing Link Research
4. The Human Factor: SEO for People

More SEO articles on more niche topics can be found in our main search engine optimization section.