Category Archives: SEO Advice


Why Flash is Evil

Flash is evil, Flash is bad, Flash is slow, Flash is wrong, Flash sucks

But do you know why Flash is evil? Using Flash elements on your website will probably make it look good, and will also let you do cool things with your design. The downside is that it also makes your website load slower, and your Flash content is not readable by search engines. Both of these hurt your search rankings, since faster load times mean higher rankings and more crawlable content means higher rankings. To point out just how bad Flash really is for your website, an organization was even created: the Flash Sucks Organization.

SEO: Flash Is Evil. Five Big Reasons Not to Use Flash

There are many reasons not to use Flash; here I have listed a few of them.

Flash is evil

Building Flash-powered websites is wrong. Storing your content in Flash movies is wrong. Implementing site navigation in Flash is wrong.
Then why are there so many Flash sites? They look pretty, with all those neat vector graphics, gradients, animations and cool sound effects. Flash is the favorite toy of big design studios and countless amateur graphic artists alike. Flash is visually attractive, and in general attractive websites are more successful than ugly ones (notable exceptions: craigslist.org and plentyoffish.com). But that is not the case with Flash websites: all the benefits of the nice look are outweighed by the disadvantages in terms of SEO and usability.

Flash requires bandwidth

Despite DSL Internet access being available almost everywhere, there are still lots of people surfing the Net via dial-up or other limited-bandwidth connections. Flash files, especially those using sound effects, embedded movies or bitmap images, can take a while to load.

Disabled back button

Some Flash designers use meta refreshes or other tricks to disable the browser’s Back button. As the famous usability expert Jakob Nielsen says, the Back button is the second most important navigation element after hyperlinks. People who cannot use the Back button will click the third most important navigation element – that X button in the top right. Besides, if you are going to promote a Flash site via PPC, you should know that Google AdWords doesn’t approve pages with a disabled Back button.

Flash ignores users’ needs

Whereas the ground rules of marketing emphasize concentrating on users’ needs, Flash websites ignore them. Take the infamous site intros and splash screens, which are as annoying as 45 minutes of advertising and previews at the cinema. Or another example: sound effects – they can be especially inappropriate and harmful when you are browsing the Net from a cubicle in a quiet office, or from home in the late hours.

Problems with third-party Flash developers

Unless you do Flash yourself, you might face some serious trouble with developers. Some of them code their projects to prevent editing, thus forcing you to hire them over and over again whenever you need even the smallest modification. Aaron Wall, in his SEOBook (a highly recommended SEO read), describes a case of a Flash developer who disabled the Back button and then asked $4,000 from his client to re-enable it, even though the problem was caused by his own incompetence.

Search engines do not like Flash

And perhaps the most important reason: not every search engine is able to crawl and index the content of Flash movies, and even those that can often do it with errors. Although Google has limited access to Flash within websites, Flash objects themselves are not as well structured, code-wise, as HTML documents.
This is a big issue, because Google cannot determine the context of information stored in Flash from the surrounding HTML code. This is particularly true of a website fully implemented in Flash as a single file: search engines simply would not be able to direct visitors to the proper page within that file.
So what is Flash really good for? Banners and ads – it provides far more useful features than the traditional GIF animation. Online games – remember ‘Yeti Sports’? And Flash video – for video blogs.

If misused, Flash breaks the basic conventions of using a web browser

Functions like going back and forth, copying and pasting, and bookmarking won’t work within Flash. This shows that Flash is not only a problem for disabled users, but also in everyday use. The conclusion to all this: Flash is evil!

What is PageRank good for anyway

What is PageRank good for anyway

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.
This is my first YOUmoz post, and I would greatly appreciate your feedback. I will be actively responding to comments, and I know that we will get a great discussion going. Please comment with any critique, questions, or random thoughts that you may have. If you would rather skip the statistics, feel free to jump ahead to the discussion section.
Introduction
A couple of months ago, SEOmoz explored the relationship between a web page’s PageRank and its position in search results. They concluded:
Google’s PageRank is, indeed, slightly correlated with their rankings (as well as with the rankings of other major search engines). However, other page-level metrics are dramatically better, including link counts from Yahoo and Page Authority.
I was intrigued by the study, and vowed to investigate the metric using my own data set. Because all of my data are at the root domain level, I chose to focus on the homepage PageRank of each domain.
Methods
I averaged three months of data (November, 2009 – January, 2010), collected on the last day of each month for 1,316 root domains. Using Quantcast Media Planner, I selected websites that had chosen to make their traffic data public. To be included, websites had to have an average of at least 100,000 unique US visitors during this time period.
The domains selected for this study do not approximate a random sample of websites. Because of the way in which they were selected, they will bias in favor of sites with many US visitors, and against sites with very few. There may also be differences between Quantified sites with public traffic data, and non-Quantified websites. For example, Quantified domains are probably more likely to include advertising on their pages than sites without the Quantcast script.
PageRank
PageRank (PR) can only take eleven values (0-10). It is an ordinal variable, meaning that the difference between PR = 8 and PR = 9 is not the same as the difference between PR = 3 and PR = 4. Like mozRank, it probably exists on a log scale.
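Because PR is ordinal, the correlations reported below use Spearman's coefficient, which is simply Pearson's correlation computed on the ranks of the values (ties get their average rank). For readers who want to check the statistic on their own numbers, here is a minimal JavaScript sketch; the sample arrays at the bottom are invented for illustration and are not data from this study.

    // Spearman's rank correlation = Pearson's correlation computed on ranks.
    // Tied values receive the average of the ranks they span.
    function ranks(values) {
      var indexed = values.map(function (v, i) { return { v: v, i: i }; })
                          .sort(function (a, b) { return a.v - b.v; });
      var r = new Array(values.length);
      var pos = 0;
      while (pos < indexed.length) {
        var end = pos;
        while (end + 1 < indexed.length && indexed[end + 1].v === indexed[pos].v) end++;
        var avgRank = (pos + end) / 2 + 1; // 1-based average rank for the tie group
        for (var k = pos; k <= end; k++) r[indexed[k].i] = avgRank;
        pos = end + 1;
      }
      return r;
    }

    function pearson(x, y) {
      var n = x.length, mx = 0, my = 0, num = 0, dx = 0, dy = 0, i;
      for (i = 0; i < n; i++) { mx += x[i] / n; my += y[i] / n; }
      for (i = 0; i < n; i++) {
        num += (x[i] - mx) * (y[i] - my);
        dx += (x[i] - mx) * (x[i] - mx);
        dy += (y[i] - my) * (y[i] - my);
      }
      return num / Math.sqrt(dx * dy);
    }

    function spearman(x, y) { return pearson(ranks(x), ranks(y)); }

    // Invented example: homepage PR vs. a hypothetical domain-level metric
    var pr     = [4, 5, 6, 6, 6, 7, 7, 8, 9];
    var metric = [3.9, 5.2, 5.8, 6.4, 6.1, 6.9, 7.3, 7.8, 9.1];
    console.log('Spearman r = ' + spearman(pr, metric).toFixed(2));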
The median and mode PageRank among websites in this study were PR = 6, with a minimum of PR = 0 and a maximum of PR = 9. However, only ten websites had PR < 3, and only seven had PR = 9.
Results
SEOmoz Metrics
Using Spearman's correlation coefficient, I compared PageRank to several SEOmoz root domain metrics. Domain mozRank (linearized) was strongly correlated with PR (r = 0.62)*. This correlation was somewhat smaller than the 0.71 that SEOmoz reported in May 2009. The disparity may be due to differences in methodology; SEOmoz used Pearson's correlation coefficient and did not linearize mozRank. Additionally, PR data in my study were probably measured over a smaller range of values, potentially weakening the observed dependencies.
*All reported correlations are significant at p < .01.
MozTrust was also highly correlated with PageRank (r = .62), with Domain Authority somewhat less so (r = .55). The latter has since undergone some major changes, and this result may not reflect the metric as it exists today.
Search Engine Indexing
I performed [site:example.com] queries using the Google, Yahoo, and Bing APIs to approximate the number of pages indexed by each search engine. Much to my surprise, PageRank shared the strongest correlation with the number of pages indexed by Bing (r = .52), rather than Google (r = .30) or Yahoo (r = .24).
My first thought was that Google might not have reported accurate counts, a phenomenon often noted by SEO professionals. However, there is some evidence that may indicate otherwise. If Google's reported indexation numbers were inaccurate, we would expect the metric to have lower correlations with similar metrics. However, indexation numbers reported by Google and Yahoo share a fairly high Pearson's correlation coefficient (r = 0.38). Both appear to share smaller correlations with Bing: 0.34 and 0.26, respectively. Even more interesting, SEOmoz metrics seem to have much stronger correlations with Bing's indexed pages than with the numbers reported by Google or Yahoo.
If Google were failing to accurately report the size of its index, we might expect that similar queries would also return inaccurate data. However, PageRank shares a high Spearman's correlation coefficient with the number of results returned by a Google [link:example.com] query (r = 0.65). The strength of this relationship appears similar to those between SEOmoz metrics and PR mentioned earlier. PR's correlation with the results of a Yahoo [linkdomain:example.com -site:example.com] query is somewhat smaller (r = 0.53).
If the number of pages Google reports having indexed were a relatively poor metric, we would also expect to find more variation between months than with other search engines. However, I did not find this to be the case. In fact, Bing had by far the highest average percent change in the number of pages indexed, a whopping 355% increase per month. Google averaged an increase of 61%, and Yahoo an increase of only 2%.
While it is still possible that the number of pages on each domain that Google reports to have indexed is inaccurate, I see another potential explanation. More so than for Yahoo or Google, the number of pages that Bing will index on any given domain is related to the quantity and quality of links to that domain. Perhaps, at least when it comes to indexation, Bing follows more of a traditional PageRank-like algorithm. After all, Google claims that PR is only one of more than 200 signals used for ranking pages.
This theory is supported by the results of SEOmoz's comparison of Google's and Bing's ranking factors.
Social Media
PageRank even shares fairly strong correlations with social media metrics such as how many of a domain's pages are saved on Delicious (r = 0.49), how many stories it has on Digg (r = 0.38), and even the number of Tweets linking to one of its pages as measured by Topsy (r = .38).
Website Traffic
Last, but certainly not least, PageRank predicts website traffic with somewhat surprising strength. As reported by Quantcast, monthly page views, visits, and unique visitors are all significantly correlated with PR. Google's little green bar even correlates with visits per unique visitor (r = 0.18), but not with page views per visit. However, putting this in context shows the value of a metric like Domain Authority.
Discussion
So what exactly does all of this mean, and why is it important? First, despite being a page-level metric, homepage PageRank is actually a fairly good predictor of many important domain-level variables relevant to SEO, social media, and website traffic. For instance, on average, websites with a PR = 7 homepage had 2.6 times as many unique visitors as those with a PR = 6 homepage, which in turn had 1.5 times as many unique visitors as those with a PR = 5 homepage.
Second, homepage PageRank is sometimes used as a proxy for a hypothetical “domain PageRank.” While technically inaccurate, this study supports the idea that the PR of a website's homepage provides information about the domain as a whole. While it may be limited to just eleven possible values, PR is surprisingly good at predicting the relative number of inbound links to a domain reported by Google and Yahoo, as well as the relative number of pages indexed by Bing. The key word here is “relative.” As an ordinal variable, PR cannot be used to predict the actual values of continuous variables.
Finally, this study provides evidence that SEOmoz's domain-level metrics may be good (and possibly better than PageRank) predictors of variables important to search, social media, and web analytics. This, as well as all of the results of this study, should be interpreted within the context of the included domains (high-traffic, US-centric, and publicly Quantified).
I hope you enjoyed reading my post, because I certainly enjoyed writing it. I intend to write many more based on your feedback.
http://www.seomoz.org/blog/what-is-pagerank-good-for-anyway-statistics-galore

Website Load Time

Using site speed in web search ranking

You may have heard that here at Google we’re obsessed with speed, in our products and on the web. As part of that effort, today we’re including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.

While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven’t seen much change to your site rankings, then this site speed change possibly did not impact your site.

We encourage you to start looking at your site’s speed (the tools above provide a great starting point) — not only to improve your ranking in search engines, but also to improve everyone’s experience on the Internet.

Poor website load time irritates people, matters to search engines, and can cost you money
Remember the late 90s? Those were the times! I remember staying awake late at night, just because the internet connection was slightly better then. I was actually happy just to download larger files without getting disconnected. People were not so obsessed with speed and loading times, but with availability in general.

In this day and age, website availability is still a major problem. Most of us experience this on a daily basis. We didn’t get all cranky about loading times overnight, but we do get more demanding with each technology leap. Would the modern user wait 20-30 seconds for a page to load completely? Nope, and neither would Google.
Your site needs to be in pretty bad shape to get in trouble with search engines simply because of high loading times. According to Google, less than 1% of all queries are affected by the site speed signal. In fact, you are far more likely to lose prospective customers simply because they can get the service elsewhere, without having to wait for a page to load. We tend to take fast-loading web resources for granted.
Nearly one-third of the world’s population is online. We live in a world where ~2.27 billion people expect a web page to be completely available in just a few seconds or less. Or do they? Technological advancements are made, but not everyone can take advantage of them.
“The future is already here — it’s just not very evenly distributed.”
This great quote from William Gibson describes the situation in some geographical regions pretty well. If you check your analytics account, I’m pretty sure you will see Africa underperforming in webpage loading times.

Is it worth it to improve your website load time?

Website owners are obsessed with their rankings, quality of traffic, conversion rates, reputation management and the overall well-being of their online presence. It is quite frustrating how often website availability is overlooked.
A site might be up and running, but virtually inaccessible to some. Problems do occur, and more often than you might think. Unlike the online resources of the 90s, the majority of websites today rely on server-side scripting, which in turn relies on properly functioning databases. The point is that a lot more can go wrong with a website nowadays than it used to.

I recently came across this question in a Q&A section:

I was wondering what benefits there are investing the time and money into speeding up an eCommerce site. We are currently averaging 3.4 seconds of load time per page and I know from webmaster tools they hold the mark to be at closer to 1.5 seconds. Is it worth it to get to 1.5 seconds? Any tips for doing this?

That’s a pretty tough one. No one can tell you. People will usually offer the opinion that it cannot hurt to do everything possible to make your site run better, and they will be right. That is the perfect scenario. We, however, live in a world where optimal decisions are far more valuable than absolute ones. In this case, 3.4 seconds is the average loading time, which means it could be significantly higher at certain hours or in certain geographical locations.

If you have a local site and sell to the local market, then there is no need to serve a page in 1.5 seconds to people on the other side of the world. You simply need to perform that well in your local market.
Measuring the speed
Due to some mild problems with the loading speed of a website I look after, I went looking for a proper way to measure the following (a quick browser-side sketch follows the list):

  • Website uptime
  • Webpage response time
  • Web page loading time
  • Availability from various remote locations, worldwide
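As a quick browser-side spot check of page loading time (the sketch mentioned above), the HTML5 Navigation Timing interface, which also powers the Google Analytics feature discussed below, can report how long the current page took to load. This is purely an illustration and is not part of any monitoring service’s code:

    // Log the load time of the current page using the Navigation Timing interface.
    window.addEventListener('load', function () {
      // loadEventEnd is only populated after the load event finishes, so wait a tick.
      setTimeout(function () {
        var t = window.performance && window.performance.timing;
        if (!t) { return; } // browser does not expose Navigation Timing
        var loadTimeMs = t.loadEventEnd - t.navigationStart;
        console.log('Page load time: ' + (loadTimeMs / 1000).toFixed(2) + ' s');
      }, 0);
    });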

After trying out a couple of services, I decided to go with a combination of two. I figured I needed one paid service in order to gain access to more features, detailed reports, SMS alerts and a large set of remote locations to test from. I went with websitepulse.com. There are some great alternatives out there; I can’t really call them substitutes, because those services are pretty good too. I chose to stick with that company because their service has everything I currently require. Their homepage looks a little outdated, but that kind of makes me think they put their effort and resources into the service itself.
The other way I track the average loading times of my pages is through the Google Analytics Site Speed report. Last year I had to add one extra line to the GA tracking code to get the feature, but now it is part of the standard code, so it might be a good idea to update your snippet. I still use the old _trackPageLoadTime call in my Google Analytics tracking code. With it, 10% of the traffic is sampled, and the report is based on that 10%, which only includes visitors whose browsers support the HTML5 Navigation Timing interface or have the Google Toolbar installed.
When you update to the new code with _setSiteSpeedSampleRate(), the sample rate falls to 1% by default. You can always increase it back up to 10%, and that’s about it, at least for now. Here you can learn more about the Site Speed feature.
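For reference, this is roughly what the change looks like in the classic asynchronous ga.js snippet. It is a generic illustration rather than the exact code from my site, and 'UA-XXXXX-Y' is a placeholder web property ID:

    var _gaq = _gaq || [];
    _gaq.push(['_setAccount', 'UA-XXXXX-Y']);   // placeholder web property ID
    // Old approach: opt in to load-time tracking explicitly
    // _gaq.push(['_trackPageLoadTime']);
    // New approach: site speed is collected by default at a 1% sample;
    // optionally raise the sample rate (here to 10%)
    _gaq.push(['_setSiteSpeedSampleRate', 10]);
    _gaq.push(['_trackPageview']);

    (function () {
      var ga = document.createElement('script');
      ga.type = 'text/javascript';
      ga.async = true;
      ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
      var s = document.getElementsByTagName('script')[0];
      s.parentNode.insertBefore(ga, s);
    })();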
I found this feature really valuable when explaining loading time issues to the web developers at the company, and also when presenting the data to management. The feature seems to work for both parties: it gives the development team all the necessary data, page by page.

The visualizations are also pretty great and can make a good addition to your next presentation.
Now, here is why I decided to go with a paid service. While the information from Google Analytics is pretty useful and well organized, it isn’t actually that accurate, mostly due to the limited number of samples. For sites with very high traffic, the information might be closer to the actual figures. I don’t want to trash-talk a free service I love; it is great for what it says it does.
I did the following test: I sampled my site in January with both Google Analytics and WebSitePulse, using the United Kingdom as the test location. Here are the results from WebSitePulse.

There is a noticeable change from the 14th of January. This is when we moved the site from our own servers to a data center. We had been having issues with the average loading times for quite some time, and I bet this was one of the best decisions of the last couple of years. You can see how the average loading time went down from ~4.5 to ~0.95 seconds overnight. The site had a lot of heavy server-side scripts, containing a lot of exceptions and some heavy communication with the database.
This is the information I have from Google Analytics for January. I expected to see something similar to the chart above, but here I got only 827 samples, while in the chart above I have 10 times that amount.

Here you can see a slight improvement, but nothing too dramatic. According to this report I got an improvement of 1.79 seconds. It had no dramatic effect on the bounce rate, but you can see it going down a bit. Then again, it might be something else.
So with 10 times the data, I got more accurate results. I tested the site every five minutes to make sure I had a good enough data set to work with. With Google Analytics, all you get is 10% of the traffic. It is quite useful and pretty good for a free tool, but I won’t close down my paid account just yet. In fact, go out and try any of the remote website monitoring services. Most of them have 30-day trials, which will give you valuable intel on the current state of your website.
Google Analytics seems to have a great feature in the works, and I’m really optimistic about it! I’m curious to know how the Site Speed feature is working out for you.
Tom Critchlow wrote a great piece covering the feature. It might be a good place to read more about Site Speed.

Organic vs Paid Search Results

Organic vs Paid Search Results: Organic Wins 94% of Time

Search engine users overwhelmingly click on organic results on Google and Bing by a margin of 94 percent to 6 percent. That’s according to new research from GroupM UK and Nielsen, published today by eConsultancy, based on a sample of 1.4 billion searches conducted by 28 million UK citizens in June 2011.
On the organic side, the research also broke down brand vs non-brand click-through rates, as well as click-throughs by vertical. On the paid side, the research revealed some demographic data about who is most likely to click on PPC results. Finally, it determined whether Google or Bing delivered more successful searches.

Organic Search Results & Click-Through Rates

Others have previously tried to gauge organic click-through rates (CTRs) for the top 10 results on Google and Bing, resulting in varied percentages, but with a recurring and obvious theme: the higher you rank, the more people click through to your website; the lower you rank, the fewer clicks and less traffic your site gets. Thus, ranking high on Page 1 is of ultimate value to every website.

Unlike previous studies, however, the GroupM UK and Nielsen study broke down the search queries into branded and non-branded. Overall, users clicked on one of the top three results 68 percent of the time:
• Result 1: 48 percent
• Result 2: 12 percent
• Result 3: 8 percent
• Remainder: 32 percent
On branded searches, the top search result overwhelmingly received the most clicks (which makes sense, considering the search is likely navigational in nature):
• Result 1: 80 percent
• Result 2: 6 percent
• Result 3: 4 percent
• Remainder: 10 percent
This also may give you some insight as to why Google is now showing 7 results, rather than 10, on branded searches. No doubt, Google is seeing the same sort of data on their end, and realizes that 90 percent of users are going to one of the top 3 spots, so in most cases the average user won’t even notice Google has eliminated three search results at the bottom of Page 1.
On non-branded searches, however, the data indicates that searchers are more willing to go beyond the top 3 results (and this data is quite similar to Optify’s findings on average CTR from last year):
• Result 1: 35 percent
• Result 2: 15 percent
• Result 3: 11 percent
• Remainder: 39 percent
This data seems like good news for sites that don’t have the luxury of ranking in one of the top three spots. There’s definitely traffic and money to be had in that 39 percent of searches (positions 4-10) if you can get your site onto Page 1.
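To make that concrete, here is a small back-of-the-envelope calculation applying the non-branded CTRs above to a hypothetical keyword; the 10,000 monthly searches figure is invented purely for illustration.

    // Rough illustration: monthly visits implied by the non-branded CTRs above,
    // for a hypothetical keyword with 10,000 searches per month.
    var monthlySearches = 10000;
    var ctrByPosition = {
      'position 1': 0.35,
      'position 2': 0.15,
      'position 3': 0.11,
      'positions 4-10 (remainder)': 0.39
    };
    Object.keys(ctrByPosition).forEach(function (pos) {
      var visits = Math.round(monthlySearches * ctrByPosition[pos]);
      console.log(pos + ': ~' + visits + ' visits/month');
    });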
Natural CTRs by vertical were also examined. The highest CTRs on the top three results were on searches for airlines, broadcast media, and auto manufacturers; current events/news, home and garden, and computers and consumer electronics had the lowest CTRs for the first three positions.
Paid Search Results & Click-Through Rates
Who are those 6 percent clicking on PPC ads? Women are more likely to click on paid search ads than men, accounting for 53 percent of ad clicks compared to 47 percent for men.
Age is also a factor. Younger searchers are less likely to click on paid ads: only 35 percent of ad clicks come from searchers aged 34 or younger. As age increases, so does the likelihood of clicking, with 65 percent of ad clicks coming from searchers aged 35 and older.
Google vs Bing
Google is crowned the “clear leader” on successful search rates, with a 91 percent successful search rate. Bing’s percentage of successful searches: 76 percent.
The full infographic is available in the original article:
http://searchenginewatch.com/article/2200730/Organic-vs.-Paid-Search-Results-Organic-Wins-94-of-Time

Duplicate Content For Dummies

Duplicate Content For Dummies

Duplicate content is content that appears on the Internet in more than one place (URL). This is a problem because when there is more than one piece of identical content on the Internet, it is difficult for search engines to decide which version is more relevant to a given search query. To provide the best search experience, search engines will rarely show multiple duplicate pieces of content and are thus forced to choose which version is most likely to be the original (or best).

Three of the biggest issues with duplicate content are:
1. Search engines don’t know which version(s) to include/exclude from their indices
2. Search engines don’t know whether to direct the link metrics (trust, authority, anchor text, link juice, etc.) to one page, or keep them separated between multiple versions
3. Search engines don’t know which version(s) to rank for query results
When duplicate content is present, site owners can suffer ranking and traffic losses, and search engines provide less relevant results.
SEO Best Practice
Whenever content on a site can be found at multiple URLs, it should be canonicalized for search engines. This can be accomplished using a 301 redirect to the correct URL, using the rel=canonical tag, or, in some cases, using the parameter handling tool in Google Webmaster Central.
301 Redirect
In many cases the best way to combat duplicate content is to set up a 301 redirect from the “duplicate” page to the original content page. When multiple pages with the potential to rank well are combined into a single page, they no longer compete with one another and together create a stronger relevancy and popularity signal overall. This will positively impact their ability to rank well in the search engines.
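How a 301 is set up depends on your server. As one illustration, assuming a Node.js site built on Express (which is not something the source article specifies), permanent redirects for both a duplicate page and a non-canonical hostname could look like this:

    // Sketch: consolidate duplicate URLs with permanent (301) redirects in Express.
    var express = require('express');
    var app = express();

    // Send the non-www host to the canonical www host
    app.use(function (req, res, next) {
      if (req.headers.host === 'example.com') {
        return res.redirect(301, 'http://www.example.com' + req.originalUrl);
      }
      next();
    });

    // Redirect an individual duplicate page to the original content page
    app.get('/duplicate-page', function (req, res) {
      res.redirect(301, '/original-page');
    });

    app.listen(80);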

Rel="canonical"
Another option for dealing with duplicate content is to use the rel=canonical tag. The rel=canonical tag passes the same amount of link juice (ranking power) as a 301 redirect, and often takes much less development time to implement.
The tag is part of the HTML head of a web page. It isn’t new; like nofollow, it simply uses a new rel parameter. For example, <link rel="canonical" href="http://www.example.com/canonical-version-of-page/" /> tells Bing and Google that the given page should be treated as though it were a copy of the URL www.example.com/canonical-version-of-page/, and that all of the link and content metrics the engines apply should actually be credited toward the provided URL.
noindex, follow
The meta robots tag with the values "noindex, follow" (i.e. <meta name="robots" content="noindex, follow">) can be implemented on pages that shouldn’t be included in a search engine’s index. This allows search engine bots to crawl the links on the specified page, but keeps them from including the page itself in their index. This works particularly well for pagination issues.
Parameter Handling in Google Webmaster Tools
Google Webmaster Tools allows you to set the preferred domain of your site and to handle various URL parameters differently. The main drawback of these methods is that they only work for Google: any change you make here will not affect Bing’s or any other search engine’s settings.
Set Preferred Domain
This should be set for all sites. It is a simple way to tell Google whether a given site should be shown with or without a www in the search engine result pages.
Additional Methods for Removing Duplicate Content
1. Maintain consistency when linking internally throughout a website. For example, if a webmaster determines that the canonical version of a domain is www.example.com/, then all internal links should go to http://www.example.com/example.html rather than http://example.com/example.html (note the absence of www).
2. When syndicating content, make sure the syndicating website adds a link back to the original content. See Dealing With Duplicate Content for more information.
3. Minimize similar content. Rather than having one page about raincoats for boys (for example) and another page about raincoats for girls that share 95% of the same content, consider expanding those pages to include distinct, relevant content for each URL. Alternatively, a webmaster could combine the two pages into a single page that is highly relevant for children’s raincoats.
4. Remove duplicate content from search engines’ indices by noindexing with meta robots, or through removal via Webmaster Tools (Google and Bing).

http://www.seomoz.org/learn-seo/duplicate-content

Diversify your SEO

Diversify your SEO

http://www.seomoz.org/blog/top-1-seo-tips-for-2013
If we’ve learned anything in 2012, it’s that Google isn’t letting up on low-value tactics. We’ve had the Penguin update, 13 Panda updates (so many that we needed a new naming scheme), and a crackdown on low-quality Exact Match Domains (EMDs), to name just a few. While I can’t tell you Google’s next move, I can tell you one thing with absolute certainty – there is more to come. So, how can you protect what you’ve built in 2013?
I was going to write a long list of suggestions, but I realized that they almost all boiled down to just one idea. I’m not going to toy with you – my top tip for 2013 SEO is this: diversify.

If at any point in 2012 you asked “What’s the best [X] for SEO?” (link-building tactic, tag, directory, etc.), you’re already in trouble. Any single-tactic approach is short-term at best. Real companies, real link profiles, and real marketing are rich with variety.
So, what does that mean, practically? I’m going to cheat a bit and split my one tip into five kinds of diversity that I think are critical to your SEO success in the coming years.
1A. Diversify Anchor Text
Let’s start with an easy one. We’ve all known for a while that overly aggressive inbound link anchor text was pushing the envelope, and the Penguin Update definitely reinforced that message. If every link to your site reads “buy best Viagra cheap Viagra today!”, it might as well read “spam spam spammity spam,” especially if it’s in a sentence like:
If you’re looking for the best price on the new iPad and iPad cases, then buy best Viagra cheap Viagra today! and get a free bag of Acai berries.
It’s not natural, and you know it. What’s the best way to make your anchor text seem “natural?” Stop obsessing over it. Yes, anchor text is a signal, but any solid link profile is going to naturally use relevant text and appear in the context of relevant text. If you want to tweak the text on some of your high-authority links, go for it, but I wouldn’t break out the spreadsheets in 2013.
1B. Diversify Your Links
Are guest posts the one true answer to all of life’s questions or are they a scourge on our fragile earth? To read the SEO blogosphere in 2012, it’s hard to tell. Any link-building tactic can be low quality, if you abuse it. The problem is that someone reads a tip about how guest posts make good links and then they run off and publish the same slapped-together junk on 15,000 sites. Then they wonder why their rankings dropped.
Nothing screams manual link-building like a profile that’s built with only one tactic, especially if that tactic is too easy. At best, you’re eventually going to be doomed to diminishing returns. So, take a hard look at where your links came from in 2012 and consider trying something new next year. Diversify your profile, and you’ll diversify your risk.
1C. Diversify Traffic Sources
There’s an 800-lb. Gorilla in the room, and we’re all writing more SEO blog posts to avoid talking about it. Most of us are far too dependent on Google for traffic. What would you do if something changed overnight? I know some of you will object – “But ALL my tactics are white-hat and I follow the rules!” Assuming that you understood the rules 100% accurately and really followed them to the letter, what if they changed?
The more I follow the Algorithm, the more I realize that the changing search UI and feature landscape may be even more important than the core algorithm itself. What happens if your competitor suddenly gets site-links, or you’re #8 on a SERP that drops to only 7 results, or everyone gets video snippets and you have no videos, or your niche shifts to paid inclusion and you can’t afford to pay? Even if you’ve followed the rules, your traffic could drop on a moment’s notice.
You need to think beyond Google. I know it’s tough, and it’s going to take time and money, but if you’re dependent on Google for your livelihood, then your livelihood is at serious risk.
1D. Diversify Your Marketing
There’s been a very positive trend this year toward thinking about marketing much more broadly – not as a tactic to trick people into liking you, but as the natural extension of building a better mousetrap. I think this is at the heart of RCS (not to put words in Wil’s mouth) – if you do something amazing and you believe in it, everything you do is marketing. If you build crap and you know it’s crap, then marketing is sleight of hand that you hope to pull on the unsuspecting. You might score twenty bucks by stealing my wallet, but you’re not going to gain a customer for life.
Stop taking shortcuts and make a real resolution in 2013 to think hard about what you do and why it has value. If you understand your value proposition, content and marketing naturally flow out of that. Talk to people outside of the SEO and marketing teams. Find out what your company does that’s unique, exciting, and resonates with customers.
1E. Diversify Your Point Of View
I recently had the pleasure to finally see Michael Dorausch (a chiropractor and well-known figure in the local SEO community) speak. Dr. Mike arrived in Tampa for BlueGlassX and built his presentation from the ground up, using photography to tell stories about the neighborhood and local history. It’s hard to explain in a few sentences, but what amazed me was just how many ideas for unique and original content he was able to find in less than 48 hours, just by having a fresh perspective and passion for the subject. I’d like to say I was inspired by the presentation, but to be totally honest, I think the emotion was embarrassment. I was embarrassed that he was able to generate so many ideas so quickly, just by coming at the problem with the right attitude.
In 2013, if you tell me your industry is “boring,” be warned – I’m going to smack you. If you’re bored by what you do, how do you think your prospects and customers will feel? Step out – have someone give you a tour of your office like you’ve never been there. Visit your home city like you’re a tourist coming there for the first time. Get five regular people to walk through your website and try to buy something (if you don’t have five normal friends, use a service like UserTesting.com). The New Year is the perfect time for a fresh perspective.
1F. Happy Birthday, Erica!
Ok, this has nothing to do with the post, but today is Erica McGillivray’s birthday. If you don’t know Erica, she’s our Community Attaché here at SEOmoz. So, diversify your communications today and wish her a happy birthday.