Category Archives: On Page Optimization

On Page Search Engine Optimization

Website Navigation Mistakes

Website Navigation Mistakes – Two Big Mistakes of Website Navigation

There are two big mistakes that are made when it comes to website navigation. The first is using JavaScript and Flash menus. The second is using a ‘deep’ navigation structure. Spiders have trouble crawling JavaScript and Flash links. Some can do it, but the vast majority will instead choose to ignore them altogether. The best alternative is to opt for XHTML and CSS menus. If you have to use images for your menus, then make sure you optimize the file names and make proper use of the alt and title attributes.
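As a sketch, a search-friendly menu is nothing more than plain XHTML links styled with CSS; the page and image names below are made-up examples:

```html
<!-- Plain-HTML menu: spiders follow ordinary <a href> links -->
<ul id="main-nav">
  <li><a href="/services/">Services</a></li>
  <li><a href="/about/">About Us</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>

<!-- If an image must be used, optimize its file name and its alt/title attributes -->
<a href="/services/">
  <img src="/images/seo-services-menu.png"
       alt="SEO Services" title="Our SEO Services">
</a>
```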

Deep Navigational Structures of Website Navigation

The other major problem is deep navigational structures. This refers to how many clicks it takes to get to your website’s deepest content. If you have to go through too many clicks, then there’s a chance that search engines will ignore and eventually stop indexing your content as frequently. The best practice to fix this is to make your website’s structure shallow. According to some SEO experts, it should only take a maximum of three to four clicks to get from your website’s homepage to your deepest content. An added bonus of a shallow structure is that it increases your website’s user friendliness, which can decrease your website’s bounce rate and increase your Search Engine rankings.

The Website Navigation You Should Use

To be effective, the website navigation system needs:

  • To be consistent throughout the website. The website visitors will learn, through repetition, how to get around the website.
  • The main navigation links kept together. This makes it easier for the visitor to get to the main areas of the website.
  • Reduced clutter by grouping links into sections. If the list of website navigation links is grouped into sections and each section has only 5-7 links, the navigation scheme is easier to read.
  • Minimal clicking to get to where the visitor wants to go. If the number of clicks to the web page the visitor wishes to visit is minimal, this leads to a better experience. Some visitors become confused or impatient when clicking through a bunch of links to get to where they want to be. In large websites, this can be difficult to reduce. Using breadcrumbs is one way to help the visitor see where they are within the website and the path back up the navigation they took.
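A breadcrumb trail of the kind mentioned above can be a simple row of links; the section names here are hypothetical:

```html
<!-- Breadcrumbs show the visitor's location and the path back up the site -->
<p class="breadcrumbs">
  <a href="/">Home</a> &raquo;
  <a href="/products/">Products</a> &raquo;
  <a href="/products/cameras/">Cameras</a> &raquo;
  Canon PowerShot SD400
</p>
```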

Spending some extra time creating the website navigation system at the planning stage will affect the overall design of the web page layout and help develop the overall plan for the website. This is something we highly recommend to avoid website navigation mistakes.

Panda and Penguin Influence on Onsite SEO


Nowadays, organic rankings on Google are getting harder and harder to manipulate. The components of great SEO are still the same: onsite and offsite optimization.

Onsite search engine optimization includes:

  • a properly structured site
  • unique and fresh content
  • proper on page title, meta and H tags, etc.

Google has zeroed in on duplicate content

Google has zeroed in on duplicate content and now shows only the authority sites where I have recently published in the search results, ignoring the numerous syndications and repostings on my own blog. On the other hand, it looks like Google is showing my own blog posts more prominently, sending more visitors for the search terms I have used.

The Panda Reality

The reality is that after the Penguin update and Google’s 500-plus algorithm changes a year, search engine optimization doesn’t make sense anymore. It’s more a gamble than a business. With SEO in 2012, search engines are pretty much 100% in control of your destiny. They change algorithms like underwear.

Why use Alt tags on Images

Why use Alt tags on Images – Make your Website more Accessible

One of the simplest ways to make your website more accessible is to use an alt attribute in your image tags. It’s amazing to me how many people forget to use this simple attribute. In fact, now, if you want to write valid XHTML, the alt attribute is required for the img tag. And yet people still don’t do it.

What is the ALT tag or Attribute?

The alt attribute is an attribute of the img tag and is meant to be an alternative for non-visual browsers when they come across images. This means that the text is meant to be used when the image is not visible on the page; instead, what is displayed (or read) is the alternative text. Many browsers also display the alt text when the customer rests their mouse on the image. This means that the alt text should be clear and easy to read, and should not create a huge popup nightmare for any reader pausing their mouse on your page.
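A minimal sketch of the attribute in use (the file name and wording are made-up examples):

```html
<!-- The alt text is read by screen readers and shown when the image cannot load -->
<img src="/images/canon-powershot-sd400.jpg"
     alt="Canon PowerShot SD400 digital camera"
     title="Canon PowerShot SD400">
```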

Browser Support for the Alt tag

The alt attribute is supported in all major browsers.

The Alt tag – Definition and Usage

The required alt attribute specifies an alternate text for an image, if the image cannot be displayed. The alt attribute provides alternative information for an image if a user for some reason cannot view it (because of slow connection, an error in the src attribute, or if the user uses a screen reader).

Note: Internet Explorer (prior version 9) displays the value of the alt attribute as a tool-tip, when mousing over the image. This is NOT the correct behavior, according to the HTML specification.

The Title Tag

The Title tag is supported in all major browsers.

Definition and Usage

The Title tag is required in all HTML documents and it defines the title of the document.

The Title element:

  • defines a title in the browser toolbar
  • provides a title for the page when it is added to favorites
  • displays a title for the page in search-engine results

Tips and Notes

Note: You can NOT have more than one Title element in an HTML document.
Tip: If you omit the Title tag, the document will not validate as HTML.

What Is a Title Tag?

The title tag has been – and probably will always be – one of the most important factors in achieving high search engine rankings.

In fact, fixing just the title tags of your pages can often generate quick and appreciable differences to your rankings. And because the words in the title tag are what appear in the clickable link on the search engine results page (SERP), changing them may result in more clickthroughs.
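Sketching the tag in context, using the Dallas example that appears later in this article:

```html
<head>
  <!-- The title appears in the browser toolbar and as the clickable SERP link -->
  <title>Johnson and Smith – Dallas Tax Accountants – CPAs in Dallas, TX</title>
</head>
```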

Search Engines and Title Tags

Title tags are definitely one of the “big three” as far as the algorithmic weight given to them by search engines; they are equally as important as your visible text copy and the links pointing to your pages – perhaps even more so. Yet, even though this has been common knowledge among SEO professionals for at least 10 years, it is often overlooked by webmasters and others attempting to optimize their websites for targeted search engine traffic.

Do Company Names Belong in the Title Tag?

The answer is a resounding YES! I’ve found that it’s fine to place your company name in the title, and (gasp!) even to place it at the beginning of the tag! In fact, if your company is already a well-known brand, I’d say it’s essential. Even if you’re not a well-known brand yet, chances are you’d like to be, right? The title tag gives you a great opportunity to further this cause.

This doesn’t mean that you should put *just* your company name in the title tag. Even the best-known brands will benefit from a few good descriptive phrases added, because they will enhance your brand as well as your search engine traffic. The people who already know your company and seek it out by name will be able to find you in the engines, and so will those who have never heard of you but seek the products or services you sell.

Title Tags Should Contain Specific Keyword Phrases

For example, if your company is “Johnson and Smith Inc.,” a tax accounting firm in Texas, you would want your company’s site to appear in the search engine results for searches on phrases such as “Texas tax accountants” and “CPAs in Texas.” (Be sure to do your keyword research to find the best phrases!) If you prefer to work with people only in the Dallas area, you’d need to be even more specific by adding geographical modifiers to your title tags, such as “Dallas tax accountants.”

Using our Dallas accountant example, you might create a title tag like this one:

Johnson and Smith Tax Accountants in Dallas

or you might try:

Johnson and Smith – Dallas CPAs

However, there’s more than enough space in the title tag to include both of these important keyword phrases. I find that using 10 to 12 words in my title tags works great.

One way to include two keyphrases would be like this:

Johnson and Smith – Dallas Tax Accountants – CPAs in Dallas, TX

I’ve always liked the method of separating phrases with a hyphen; however, in today’s competitive marketplace, how your listing appears in the SERPs is a crucial aspect of your SEO campaign. After all, if you have high search engine rankings but your targeted buyers aren’t clicking through, it won’t do you much good.

The idea is to write compelling titles as opposed to simply factual ones, when you can. But it also depends on the page, the type of business, the targeted keyword phrases, and many other factors. There’s nothing wrong with the title tag in my above example. If you were looking for a tax accountant in Dallas and saw that listing at Google, you’d probably click it. (Note: Don’t worry if some of your visible title tag info gets cut off when the search engines display your page’s info; they are still indexing all the words contained within it.)

Still, you could make it a readable sentence like this:

Johnson and Smith are Tax Accountants and CPAs in Dallas, TX

I’m not as thrilled with that one. I had to remove the exact phrase “Dallas Tax Accountants” because it wouldn’t read as well if it said:

Johnson and Smith are Dallas Tax Accountants and CPAs in Dallas, TX

It sounds redundant that way, as if it were written only for the search engines.

In the end, it’s really a personal preference.

Don’t make yourself crazy trying to create the perfect title tag, because there’s just no such thing. Most likely, either of my examples would work fine. The best thing to do is to test different ones and see which bring the most traffic to your website. You might very well find that the second version doesn’t rank as well, but gets clicked on more, effectively making up the difference.

Use Your Visible Text Copy as Your Guide

I prefer to create my title tags *after* the copy on the page has been written and optimized. I need to see how the copywriter integrated the keyword phrases into the content to know where to begin. If you’ve done a good job with your writing (or better yet, hired a professional SEO copywriter), you should find all the information you need right there on your page. Simply choose the most relevant keyword phrases that the copy was based on, and write a compelling title tag accordingly. If you can’t seem to get a handle on the most important phrases for any given page, you probably need to rewrite the page content.

I recommend that you *don’t* use an exact sentence pulled from your copy as your title tag. And don’t use the exact wording that’s in your top headline. It’s much better to have a unique sentence or a compelling string of words in your title tag.

You’ll want to watch out for certain website content management systems (CMS) and blog software that automatically generate the title tag from information you provided elsewhere. Some, in fact, default to the same exact title tag on every page, which is the best way to kill your search engine leads! The good news is that most of today’s CMS’s and blog software have workarounds so that you can customize your title tags fairly easily. If yours doesn’t, or your developer claims they can’t do this, then you’ll want to find a new developer or CMS as soon as possible!

Sitemaps for Dummies


Use a Site Map for Better SEO Results

The purpose of a site map is to spell out your Website’s central content themes and to show both search engine spiders and your visitors where to find information on your site. Traditional site maps are static HTML files that outline the first and second level structure of a Web site. The original purpose of a site map was to enable users to easily find items on the Web site. Over time, they also became useful as a shortcut method to help search engines find and index all the parts of a site.

Today, you should have an XML site map, which effectively provides an easy-to-read link dump for the spiders to index. Although certain Web browsers can display an XML site map for users to read as well, you should offer both kinds of site maps (HTML and XML) if you want to be sure to cover both the search engines and your users.

A site map displays the inner framework and organization of your site’s content to the search engines. Your site map reflects the way visitors intuitively work through your site. Years ago, site maps existed only as a boring series of links in list form. Today, they are thought of as an extension of your site. You should use your site map as a tool to lead your visitors and the search engines to more content. Create details for each section and subsection through descriptive text placed under the site map link. This helps your visitors understand and navigate through your site and also gives you more food for the search engines.

A good site map does the following:

  • Shows a quick, easy-to-follow overview of your site
  • Provides a pathway for the search engine robots to follow
  • Provides text links to every page of your site
  • Quickly shows visitors how to get where they need to go
  • Utilizes important keyword phrases

Site maps are very important for two main reasons. First, your site map provides food for the search engine spiders that crawl your site. The site map gives the spider links to all the major pages of your site, allowing every page included on your site map to be indexed by the spider. This is a very good thing! Having all of your major pages included in the search engine database makes your site more likely to come up in the search engine results when a user performs a query. Your site map pushes the search engine toward the individual pages of your site instead of making them hunt around for links. A well-planned site map can ensure your Web site is fully indexed by search engines.
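An HTML site map along these lines might look like the sketch below; the section names and descriptions are made-up examples:

```html
<!-- HTML site map: keyword-rich anchor text plus a short description per link -->
<ul class="site-map">
  <li>
    <a href="/services/">SEO Services</a>
    <p>On page optimization, site map construction, and title tag reviews.</p>
  </li>
  <li>
    <a href="/articles/">SEO Articles</a>
    <p>Guides on meta tags, keywords in the URL, and website navigation.</p>
  </li>
</ul>
```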

Here are some site map dos and don’ts

  • Your site map should be linked from your home page. Linking it this way gives the search engines an easy way to find it and then follow it all the way through the site. If it’s linked from other pages, the spider might find a dead end along the way and just quit.
  • Small sites can place every page on their site map, but larger sites should not. You do not want the search engines to see a never-ending list of links and assume you are a link farm. (More than 99 links on a page looks suspicious to a search engine.)
  • Most SEO experts believe you should have no more than 25 to 40 links on your site map. This also makes it easier to read for your human visitors. Remember, your site map is there to assist your visitors, not confuse them.
  • The anchor text (the words that can be clicked) of each link should contain a keyword whenever possible and should link to the appropriate page.
  • When you create a site map, go back and make sure that all of your links are correct.
  • All the pages shown on your site map should also contain a link back to the site map.

Just as you can’t leave your Web site to fend for itself, the same applies to your site map. When your site changes, make sure your site map is updated to reflect that. What good are directions to a place that’s been torn down? Keeping your site map current helps make you a visitor and search engine favorite.

How to Construct an XML Site Map for Better SEO Results

A great way to make your Web site more user friendly to search engine spiders is to add an XML site map. Your XML site map should be constructed according to the current Sitemap Protocol format (which is maintained at sitemaps.org). The Sitemap Protocol allows you to tell search engines about the URLs on your Web sites that should be crawled.

An XML site map is a document that uses the Sitemap Protocol and contains a list of the URLs for a site. The Protocol was written by the major search engines (Google, Yahoo!, and Live Search) to be highly scalable so that it can accommodate sites of any size. It also enables Webmasters to include additional information about each URL (when it was last updated, how often it changes, and how important it is in relation to other URLs in the site) so that search engines can more intelligently crawl the site. Note that even though its name is similar to the traditional HTML site map, an XML site map is a totally different kind of document, and the two are not interchangeable. You shouldn’t rely on an XML site map alone for your site.

XML site maps define for the spider the importance and priority of the site, better enabling the search engine to index the entire site and to quickly re-index any site changes, site expansions, or site reductions. This XML format offers excellent site indexing and spider access. Additionally, many site-mapping tools can diagnose your XML site map, informing you of duplicate content, broken links, and areas that the spider can’t access. Free site map generators that construct the XML file for you are a great place to start. Google adheres to Sitemap Protocol 0.9 as dictated by sitemaps.org. Site maps created for Google using Sitemap Protocol 0.9 are therefore compatible with other search engines that adopt the sitemaps.org standard.

A normal version of the XML code looks something like this (the URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

The table below shows both the required and optional tags in XML site maps.

Site Map Tags in XML

  • urlset (required) – Encapsulates the file and references the current protocol standard.
  • url (required) – Parent tag for each URL entry. The remaining tags are children of this tag.
  • loc (required) – URL of the page. This URL must begin with the protocol (such as http) and end with a trailing slash, if your Web server requires it. This value must be less than 2,048 characters.
  • lastmod (optional) – The date of last modification of the file. This date should be in W3C Datetime format. This format allows you to omit the time portion, if desired, and use the YYYY-MM-DD format.
  • changefreq (optional) – How frequently the page is likely to change. This value provides general information to search engines and may not correlate exactly to how often they crawl the page.
  • priority (optional) – The priority of this URL relative to other URLs on your site. Valid values range from 0.0 to 1.0. This value has no effect on your pages compared to pages on other sites; it only lets the search engines know which of your pages you deem most important so that they can order the crawl of your pages in the way you prefer. The default priority of a page is 0.5. You should set your landing pages at a higher priority and non-landing pages at a lower one.

The XML site map also must:

  • Begin with an opening urlset tag and end with a closing urlset tag
  • Include a url entry for each URL as a parent XML tag
  • Include a loc child entry for each url parent tag

Meta Tags

Meta Tags are Still Going Strong

Meta tags are the first topic of most SEO training, for better or worse. Just add meta tags and your website will magically rise to the top, right? Wrong. So why do you need them? The short answer is: they help tell search engines and users what your website is all about. Also, search engines like Google, Bing and Yahoo show meta tag information in their search results.

The two most familiar are the keyword and description meta tags, but there is a lot more to it than that. Let’s start with these two. Some experts like to pretend that the META description and the META keywords don’t do anything for your ranking. They miss several points. Not only does Google show your description tag, but by adding better and more relevant text, more people will click on the link, and that gives you a better CTR (click-through rate), which has a huge influence on your ranking!
Meta Name Description
You use this tag to explain to your visitor and search engine what they can expect to see on your site. Almost all the major search engines like Yahoo, Google, and Bing will use your description tag.
How to use it
If you have some knowledge of HTML, you can add the following line (the meta description tag) to the head section of your web page’s source:

<meta name="description" content="A short, relevant summary of this page.">

Bogus words like “qiskodslajdmnkd”, misspelled words, or your competitors’ names in the keywords are indeed useless. But if you create your meta tags with care, search engines will use your tags as-is. A better HTML page title and an optimized meta description tag give a higher CTR.

It’s very important that you use the keywords you want your site to be found under in your meta description tag. The more frequently a search engine spider finds a certain word on your website and in your meta tags, the more it values that word as important, and your site will be positioned higher in the search index.

Make sure to put some extra effort into it; it will attract extra visitors to your web page. Create relevant and different meta tags for each and every page.
Meta Name Keywords
Most SEO experts will probably tell you that this tag is dead, you don’t need it, search engines don’t care, and you shouldn’t waste your time on it. At the earliest stage of the Internet, this tag was a critical element for any website that wanted to rank high. Then the spammers came and took advantage of it, filling it with hundreds of keywords to improve their rankings. Because of this, the major search engines had to stop putting emphasis on this meta tag. Today some SEO experts tell you to get rid of it entirely; the best practice is to keep it on your pages but keep it short and relevant.
It’s very important that you add the keywords you want your site to be found under in your KEYWORD tag. If a search engine spider finds the same words on your website and in your meta tags, these words will be ranked higher in the search index. Don’t add too many words; most search engines will only index the first 20 words. Make sure that you put the 10 most important keywords first.
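A minimal sketch of the tag (the phrases are made-up examples):

```html
<!-- Keep it short: a handful of phrases that also appear in the page copy -->
<meta name="keywords" content="dallas tax accountants, texas cpa, tax preparation">
```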
The truth is to keep it relevant to the rest of your page. I put this tag in the “indifferent” category because, while no good SEO is going to recommend spending time on it, there is a small possibility it could help you somewhere. So leave it out if you’re building a site from scratch, but if it’s generated automatically there is no reason to take it out.
Meta Content Type (Popularly called The Charset Tag)
This tag is necessary to declare your character set for the page and should be present on every page. Leaving this out could impact how your page renders in the browser. A few options are listed below, but your web designer should know what is best for your site.

Finally, all sites should declare their character set. For English-language pages in the U.S., that is typically UTF-8. Just make sure this is on your page if you’re delivering HTML using English characters.
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
META Language Tag
Required By Bing
Language – The only reason to use this tag is if you are operating internationally and need to declare the main language used on the page. Check out a meta languages resource for a full list of languages you can declare.
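As a sketch, the declaration can be written like this (assuming a U.S. English page):

```html
<!-- Declares the page's main language for engines that read this tag -->
<meta http-equiv="content-language" content="en-us">
```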

Does your website contain pages in different languages? By adding index pages with different language tags you are able to adjust the language to the specific country.
Where should you add this language tag?

You may add these language tags to all of your web pages, not only the first index page. Make sure that relevant meta tags are added on every page. Add keywords and phrases that are relevant and correspond to the text and the language on that specific page. It might be a lot of work to add specific meta tags to each page, but you will notice in time that it works!

Robots Meta Tag
The robots tag is still one of the most important tags. Not so much for the proper implementation, but the improper.
The robots meta tag lets you specify that a particular page should not be indexed by a search engine or if you do or do not want links on the page followed.
Believe it or not, it is still common for a site to be deindexed because someone accidentally added a noindex tag to the entire site. Understanding this tag is vitally important.
Here are the four implementations of the Robots Meta Tag and what they mean.
• <meta name="robots" content="noindex, nofollow"> – This means: “Do not index this page. Do not follow the links on the page.” Your page will drop OUT of the search index AND your links to other pages will not be followed. This will break the link path on your site from this page to other pages.

This tag is most often used when a site is in development. A developer will noindex/nofollow the pages of the site to keep them from being picked up by the search engines, then forget to remove the tag. When launching your new website, do not trust it has been removed. DOUBLE CHECK!
• <meta name="robots" content="index, nofollow"> – This means: “Do index this page. Do not follow the links on the page.” Your page WILL be in the index AND your links to other pages will not be followed. This will break the link path on your site from this page to other pages.
• <meta name="robots" content="noindex, follow"> – This means: “Do not index this page. Do follow the links on the page.” Your page will drop OUT of the index BUT your links to other pages will be followed. This will NOT break the link path on your site from this page to other pages.
• <meta name="robots" content="index, follow"> – This means: “Do index this page. Do follow the links on the page.” Your page WILL be in the index AND your links to other pages will be followed. This will NOT break the link path on your site from this page to other pages.
NOTE: The robots tag may be ignored by less scrupulous spiders.
Other Meta Tags
There are many other meta tags, but none are really considered useful nowadays. Many of the tags that we used did things like:
• Told the spider when to come back
<meta name="revisit-after" content="30 days">
• Told the browser the distribution
<meta name="distribution" content="web">
• Told the page to refresh
<meta http-equiv="refresh" content="30">
• Told the page to redirect/refresh
<meta http-equiv="refresh" content="x_seconds; url=">
We don’t use these anymore, either because there are better ways (such as schema tagging or server side methods) or because the engines they used to work on are no longer in existence or Google has explicitly told us they are not great ideas (such as redirects at the page level).
NOTE: Schema tagging and rich snippets are collectively the most important (and somewhat quietly announced) change to how your site interacts with the search engines and the search spiders. Learn it. Know it. Implement it.


While meta tags aren’t the magical solution that you may have heard, they still play an important role in helping your website get found in search engines. Enjoy your metas!

Keywords in the URL

What is a URL

A URL, or Uniform Resource Locator, is a subset of the Uniform Resource Identifier (URI) that specifies where an identified resource is available and the mechanism for retrieving it. URLs describe a site and page to visitors and search engines, so keeping them relevant, compelling and accurate is key to ranking well.

Keywords in the URL – SEO Best Practice

The URL of a web document should ideally be as descriptive and brief as possible. If, for example, a site’s structure has several levels of files and navigation, the URL should reflect this with folders and subfolders. Individual pages’ URLs should also be descriptive without being overly lengthy, so that a visitor who sees only the URL could have a good idea of what to expect on the page.

Keywords in the URL Example: Comparison of URLs for a Canon Powershot SD400 Camera

1. –
2. –
3. –

With both example 1 and 2, a user has virtually no idea what the URL might point to. With example 3 it is easy to surmise that a review of a Canon SD400 is the likely topic of the page.
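Since the original example URLs did not survive formatting, here is a hypothetical contrast along the same lines (the domains and paths are made up; the query parameters echo the ones discussed below):

```
http://www.example-store.com/exec/product?n=3031001&v=glance&categoryid=145
http://www.example-reviews.com/reviews/canon-powershot-sd400/
```

The first URL tells a visitor nothing; the second makes the topic of the page obvious at a glance.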

Keep URLs limited to as few dynamic parameters as possible

In addition to the issues of brevity and clarity, it’s also important to keep URLs limited to as few dynamic parameters as possible. A dynamic parameter is a part of the URL that provides data to a database so the proper records can be retrieved, e.g. n=3031001, v=glance, categoryid=145, etc.

Note that in both the Amazon and Canon URLs, the dynamic parameters number 3 or more. In an ideal site, there should never be more than two. Search engine representatives have confirmed on numerous occasions that URLs with more than 2 dynamic parameters may not be spidered unless they are perceived as significantly important (i.e. have many, many links pointing to them).

The Three Primary Benefits of keywords in the URL

URLs have 3 primary benefits:

1. Semantics

A well crafted URL should semantically make sense. The DPReview URL above is a good example of a semantically accurate URL (this of course assumes that the page actually is about what is described). It is easy to tell the subject of the page just by examining its URL. This is helpful to both humans and search engines.

2. Relevancy

The other benefit of having a semantically correct URL is that webmasters are more likely to get search-engine-referred traffic due to the keywords in the URL. These, like title tags, are used for determining relevancy and computing rankings.

3. Links

Well written URLs have the additional benefit of serving as their own anchor text when copied and pasted as links in forums, blogs, or other online venues. In the DPReview example, a search engine might see the URL and give ranking credit to the page for terms in the URL like dpreview, reviews, canon, sd, 400.

Placing keywords in URL as part of your SEO campaign

Choosing the most relevant keywords in your URLs can be an important part of your Search Engine Optimisation strategy. This is because search engine crawlers will read the keywords that you have included in the URL to decide if the content is relevant to your customers’ search terms. The more words that you can include in the URL address that match the ones that your customers are typing into search engines the better your results will be. We have included tips in the SEO Junkies blog before on the importance of including keywords in your URLs. However, it is also important to recognise the value of the order and position of these keywords within the URL.

Help your SEO strategy by strategically positioning keywords in the URL

Positioning keywords further forward in the URL will make a difference to your search engine optimisation campaign’s ranking results. The location of the keywords in the URL can influence the performance of your site in the search engine rankings. A URL that places the keyword phrase immediately after the domain will tend to perform better than one that buries it at the end of a long path, because the search engine robots put more importance on words that appear earlier in the URL text. So not only is it important to include your keywords in your URL, but also to ensure that you include them as early as possible in the URL address so that the search engines give them maximum importance when assessing your site’s relevancy to your customers’ search terms.
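As a hypothetical illustration (the domain and paths are made up), the first URL below would be expected to perform better for the phrase “dallas tax accountants” than the second, because the keywords sit earlier in the address:

```
http://www.example.com/dallas-tax-accountants/
http://www.example.com/services/local/offices/dallas-tax-accountants/
```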

Keyword positioning matters to SEO

This prioritising of the keywords that appear in your URLs is similar to the practice of making sure that your keywords appear as close to the top of your written content as possible. Search engines will prioritise the relevancy of words appearing in the earlier part of the text. This principle also applies to URL addresses and this is an example of a strategy that will make your SEO campaign considerably more effective.

Content is King

This is the kind of all-or-nothing catastrophizing that does confuse a lot of folks. Somebody says, “Getting thousands of links from low-quality ‘directories’ is probably not going to help you these days,” and all of a sudden the Chicken Littles come streaming out of the woodwork declaring “All links are worthless!”

“”Content is king””
As Qwerty said, nobody here is saying you shouldn’t promote your website. And as Jill said, nobody here is saying “”content is king”” (at least, not in the sense that all you have to do is create awesome content and everything else will fall into your lap automatically).

Good content has always been an important component of real SEO as we advocate here. Crappy, keyword-stuffed, “spun” articles that were written strictly to attract search engine spiders have never been a good idea in our book. Nothing there has changed in the advice we’ve been giving since I joined this forum in 2003.

“”Backlinking is the devil””
Likewise, we’ve always been in favor of getting good links from high-quality pages. Nothing lately has changed that, either. There’s nothing wrong with back links. There’s not even anything necessarily wrong with reciprocal links, as long as there’s a good reason for the two pages to link to each other.

It should be noted, though, that getting higher rankings is not a good reason to trade links. “Your page offers good information that my visitors would find useful” is a good reason to link to another page. I’d argue it’s the only reason you should link to a page. If it happens that the site owner at the other end thinks the same thing about your page and chooses to link back to it, that’s not a problem. Never has been and I doubt it ever will be.

“”Don’t do any SEO””
Puh-leeze. None of the moderators or administrators here has ever said that. What they have said is that you don’t want to do crappy, obvious SEO. You know — the home page that declares: “”We have a great Denver area web design company that offers excellent Denver web design services to customers all over Denver who are seeking locally-created Denver web designs.”” With a zillion backlinks from a zillion crappy “”directories”” and comment spam on a zillion splogs and poorly moderated forums, all with the anchor text “”Denver web design.””

That would be a footprint as big as the iceberg that sank the Titanic.

But, you know, that was never SEO in the first place. At least not as we’ve ever defined SEO here. The “O” of “SEO” stands for “optimization.” That is, making something the best it can be. And that kind of crap, which has been passing for SEO in some circles, was never about making things even marginally better, much less the best they can be.

The best SEO doesn’t look like SEO. That’s what Chrishirst was talking about. There is nothing wrong with optimizing a page — in fact, that’s what you should do — but when it becomes obvious not only that the page has been “optimized” but exactly what phrase(s) it’s been “optimized” for, then the line has been crossed.

“God forbid you ever buy an exact match domain(!)”
And finally, if an exact match domain makes sense for your business, go for it. But you need to keep in mind that the folks at Google have already said they’re specifically targeting exact match domains, so when you buy it you need to keep your expectations reasonable. You’re not going to shoot to the top of the rankings and stay there just because you have an exact match domain. In fact, in light of what Google has said, you may even be making your job more difficult by using a keyworded-domain rather than going for something brandable.

We wouldn’t be doing a responsible job here if we didn’t point this out to people. Sadly, there are a lot of folks out there who are still operating under the idea that all they need to do is have a keyword in their domain name and they’ll automatically rank well for that keyword. And a lot of them never consider things like: what happens if you change your business model so the keyword in your domain no longer applies? Are you prepared to start all over from scratch, or would it be better for your business to have a brandable domain so you can change your business focus at any time without launching a whole new domain?

So we try to educate them, bring them up to speed, make sure they know what they’re potentially getting themselves into.

But if you understand all the potential ramifications — not just SEO-related, but business evolution, branding and marketing implications as well — and you’re ready to put in the work it takes to promote it, then by all means buy whatever domain you want.

Reality Check
Paint-by-numbers “”SEO”” doesn’t work as well today as it used to. Does it still work sometimes? Absolutely. Are the folks at Google and Bing and Baidu and elsewhere working to ensure it works less and less well in the future? From what I’ve seen, I believe so. Is there a future in the kind of algo-chasing, rules-based, formulaic “”optimization tactics”” such as have been promoted on some venues (not here) in the past? Not if you’re interested in building solid traffic and conversions that will keep your business humming on into the future. Does that mean people shouldn’t do SEO at all? No, just that they shouldn’t do crappy so-called “”SEO.”” Which is exactly the same thing we’ve been saying here since 2003.

Does your list of statements bear even a passing resemblance to anything the moderators or administrators have said here, ever? Not even close.

Can Your Website Structure Affect SEO

Can the Website Structure of Your Site Affect SEO?

Ever since the recent Google Panda updates were released, on-site SEO has become even more important when it comes to staying on top of the SEO game. Small mistakes that could be overlooked by simply doing a bit of off-site optimization have now become critical reasons for rank drops.
The real problem that a number of websites have is in their structure. Their URL format, their site navigation and, arguably most importantly, their internal linking structure all leave something to be desired. Today, you’ll learn the best practices for site structure and hopefully be on your way to making your website’s SEO better.

URL Structure

Many websites have URLs that are subpar. They don’t tell search engines anything about the content on the page, nor do they include any keywords. It doesn’t matter whether you have a plain HTML website, a CMS or a blog that runs WordPress: each page’s URL should reflect its purpose.
There are two important things to note. One is that you should replace spaces with dashes, not underscores; underscores can cause search engines to run words together and match the wrong keywords. The second is that you can go back and fix your URL structures regardless of how established your website is. Just make sure to put a 301 redirect in your .htaccess file (on Apache servers) for any page whose URL you change.
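The dashes-not-underscores advice can be sketched as a small Python helper that turns a page title into a URL slug. This is a minimal illustration, not code from the article; the function name and the exact cleanup rules are assumptions.

```python
import re

def slugify(title):
    # Lowercase the title, turn runs of spaces/underscores into single dashes,
    # and strip any character that doesn't belong in a clean URL slug.
    slug = title.lower().strip()
    slug = re.sub(r"[_\s]+", "-", slug)        # spaces/underscores -> dashes
    slug = re.sub(r"[^a-z0-9-]", "", slug)     # drop punctuation
    slug = re.sub(r"-{2,}", "-", slug).strip("-")
    return slug

print(slugify("Website Navigation Mistakes"))  # website-navigation-mistakes
```

If you rename an existing page this way on an Apache server, the matching rule in .htaccess would look something like `Redirect 301 /Website_Navigation_Mistakes /website-navigation-mistakes` (paths hypothetical).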

Site Navigation

There are two big mistakes that get made when it comes to website navigation. The first is that websites use JavaScript and Flash menus. The second is that websites use deep navigation structures.

Spiders have trouble crawling JavaScript and Flash links. Some can do it, but the vast majority will instead choose to ignore them altogether. The best alternative is to opt for XHTML and CSS menus. If you have to use images for your menus, then make sure you optimize the file names and make proper use of the alt and title attributes.

The other major problem is deep navigational structures. This refers to how many clicks it takes to get to your website’s deepest content. If it takes too many clicks, there’s a chance that search engines will ignore your content and eventually stop indexing it as frequently. The best practice here is to make your website structure shallow. According to some search engine optimization experts, it should take a maximum of three to four clicks to get from your website’s homepage to your deepest content.
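The three-to-four-click rule can be checked programmatically with a breadth-first search over your site’s internal link graph. The sketch below assumes you already have that graph as a dictionary; the URLs are invented for illustration.

```python
from collections import deque

# Hypothetical link graph: each page maps to the pages it links to.
links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widgets"],
    "/products/widgets": ["/products/widgets/blue-widget"],
    "/products/widgets/blue-widget": [],
    "/blog": [],
}

def click_depths(graph, home="/"):
    # Breadth-first search from the homepage: a page's depth is the
    # minimum number of clicks needed to reach it.
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths(links)["/products/widgets/blue-widget"])  # 3
```

Any page whose depth comes out above four is a candidate for an extra link from a shallower page.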

An added bonus of a shallow structure

An added bonus of a shallow structure is that it increases your website’s user friendliness, which can decrease your website’s bounce rate and increase your SEO rankings. Sothink DHTML Menu is one tool that can be used to organize menus and create a shallow structure.

Internal Link Structures

This is one often neglected element that is absolutely vital to on-site SEO success in a world where semantic search will become even more commonplace than it already is. Internal links use keyword-rich anchor text within your content to link to other relevant pages on your website. This allows search engines to quickly and easily figure out what your other pages are about. They can also be used to decrease page depth and make it easier for users to find related content.
The reason that internal links are so important is the increasing value search engines are putting on them. Some SEO experts have gone so far as to say that semantic web searches could one day completely eliminate the need for off-site optimization, though this isn’t quite the case just yet.
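To see roughly what a crawler takes away from your internal links, here is a small sketch using Python’s standard html.parser module. It collects each same-site link along with its anchor text; the page snippet and URL are hypothetical.

```python
from html.parser import HTMLParser

class InternalLinkParser(HTMLParser):
    """Collect (href, anchor text) pairs for links within the same site."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith("/"):   # relative path -> internal link
                self._href = href
                self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = InternalLinkParser()
parser.feed('<p>Read our <a href="/guides/site-structure">site structure guide</a>.</p>')
print(parser.links)  # [('/guides/site-structure', 'site structure guide')]
```

The anchor text is exactly the signal the article describes: a keyword-rich phrase that tells the search engine what the linked page is about.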