Steps of SEO – Part 1

September 9, 2008

Search Engine Optimization can be a confusing and overwhelming subject for many webmasters. Others simply LOVE the idea of SEO, figuring out how search engines establish their results and how to stay one step ahead of the game. Either way, SEO needs a foundation to build on, hence what I call the steps of SEO.

Step 1: Identifying A Main Topic

This is really the most crucial part of a successful SEO campaign. You must determine a main 'general' topic for your website so that you can set up a latent semantic structure around it. Here's what I mean: let's pretend that your 'general' website topic is fishing. Your main page will be a landing page about fishing, with separate pages for the topics of bass fishing, deep sea fishing, and fly fishing. Within those you can break things down even further. This way, when one page begins to rank well for a given keyword, its PageRank and success can be passed along through your site's architecture.

Step 2: Content is King

This idea isn't revolutionary by any means within the SEO community, but it is absolutely crucial if you want to be a dominating figure in the search engines. By creating a website with well written, thought-out content you can begin to become an authority in the search engines' eyes. Apply this within the LSI structure I described above and Google will love you. If you analyze websites that rank well in the search engines, they typically share two traits: an aged domain, and thousands of pages of articles and documents.

For some reason websites begin to 'pop' in the rankings when they reach the 200-300 page mark. Keep in mind that these are quality, content-driven pages, not spammy or duplicated-content pages! So get out there and start writing (and/or purchasing) content for your website.

Step 3: Identifying your Link Building/Site Architecture

As I've described in the previous steps, content and site architecture are key to a website's success. By setting up your website with a hierarchy in mind you can target hundreds of topics while keeping each subject relevant. A 'siloed' structure would look something like this:

  1. Fishing
     • Deep Sea Fishing
       • Locations
         • Puerto Vallarta
       • Techniques
         • Educational Material

Logical, right? That's the point: to make the site easier for the user to navigate as well as well thought out. This not only benefits the reader, it benefits your rankings in the search engines. By establishing a path for PageRank to flow within your site, you allow 'authority' to flow through your articles and content. That means once a single page on your site ranks well, the others will begin to rank well too.
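To make the idea concrete, here's a minimal sketch (in Python, using the fishing example from above; the slugs are purely illustrative) of how a siloed hierarchy maps to URL paths:

```python
# Minimal sketch of a silo: each topic nests under its parent,
# and every page's URL path reflects its place in the hierarchy.
silo = {
    "fishing": {
        "deep-sea-fishing": {
            "locations": {"puerto-vallarta": {}},
            "techniques": {"educational-material": {}},
        },
    },
}

def pages(tree, prefix=""):
    """Yield the URL path of every page in the silo, parents first."""
    for slug, children in tree.items():
        path = f"{prefix}/{slug}"
        yield path
        yield from pages(children, path)

for url in pages(silo):
    print(url)
# /fishing
# /fishing/deep-sea-fishing
# /fishing/deep-sea-fishing/locations
# /fishing/deep-sea-fishing/locations/puerto-vallarta
# /fishing/deep-sea-fishing/techniques
# /fishing/deep-sea-fishing/techniques/educational-material
```

Each page links up to its parent and down to its children, so authority earned anywhere in the silo has a path to the rest of your content.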

This idea reinforces the topical relevance that is so important in today's SEO campaigns. By separating subjects, and linking between related subjects within your site where available, you set yourself up for optimal results. The ultimate goal is to start with this foundation in mind! Don't make it an afterthought (when possible); build your site with LSI and interlinking in mind. With time you will begin to rank well, even for keywords you had not thought of.

[tags]foundation of seo, seo foundation, steps of seo, search engine optimization steps[/tags]

The Anchor Text Algorithm Change

September 8, 2008

I've already discussed this a little bit in a previous post, but I'd like to bring it up again due to the significance of this change. After looking at some concrete data and watching the fluctuations of SERP results for many clients, we have come to the conclusion that there was an algorithm change in the strength of anchor text. Traditionally, three portions of the relevance algorithm had a significant impact: first, the anchor text label of the incoming link; second, the amount of information surrounding the link and its relevance to the subject; and third, the age of the site the incoming link came from.

After analyzing the information, it seems that Google added another layer to their ever-changing algorithm so that a website can no longer rank for a keyword that isn't relevant or important to it. This may sound confusing, but remember my post on the Google bomb that occurred back in the day for 'failure' and the link to G.W. Bush's presidential profile page on Whitehouse.gov? Google has now changed the relevance score it associates with the anchor text pointing at a site.

Simply put, it's another piece of the puzzle. We have determined that relevance is now calculated differently, and you need to fulfill the relevance checklist in order to rank well. Having a website and some content just won't do it anymore; you need to make sure that your incoming links are relevant and that your website has the necessary relevant content to get the attention of the Google crawl bot. It has always been important to try to receive incoming links from authoritative websites with a good 'reputation'. Now I feel there is even more emphasis placed on it.

It seems that since the implementation of these changes, the idea of a Google bomb has been defused: anchor text from just any site doesn't hold the same weight, and grouped links (think blogrolls) have been quarantined until Google examines them further. Google is transforming into the beast we were all afraid of, it seems. Personalized results, new measuring metrics, and relevance of incoming links are the new elements in the Google model.

Search Engine Optimization is about adapting; without adapting correctly you will fall behind your competitors. Don't rely on old information in an ever-changing game, or you will lose. It's necessary to constantly research, identify changes, and make the necessary adjustments to stay on top. Here are some ideas on how to stay ahead of the game.

Relevant Links

As I've always stated, it's not just about the link you receive but about where it has come from and how Google regards that website. Don't just go out and blast your website to directories, or literally anywhere you can get your link. With the transformation of Google's algorithm towards relevancy and personalized results, it will be necessary to have links with topical relevance.

Increase Relevance by Creating It

As stated in a previous post, use press releases. Press releases let you create the relevancy you need as well as a backlink from a 'reputable' source. I'm not talking about free press release sites, but those that you pay for; we've had some truly great results with this method for our customers. You can also host a contest or give away an ebook. By creating these sorts of freebies and giveaways, people will link to your site to tell others about it. Use that to your advantage.

Onsite Topic Relevancy

This is a huge step: make sure that your pages are on topic and relevant to your incoming links' anchor text. Don't jump all over the place on your pages; stick to a concrete subject so that relevancy can be easily identified by the search engines. By staying on target and on topic, the search engines will easily identify the subject and the relevant, important keywords.

SEO Is About Patience

As much as we all hate it, SEO is about patience. There is no way around the fact that changes don't occur instantly. Let the search engines do their thing and don't freak out about drops in rankings, unless of course the drop has lasted a long time! Changes occur quite frequently in the algorithms, and as a result rankings will fluctuate. Remember that another 'authoritative' website may have removed your link, or the linking page may now be a 404 for whatever reason. One action can set off a chain of events in SEO, and it's not always possible to determine the cause of a fluctuation. So remember that patience is required to identify problems and correct them.

[tags]algorithm google, google anchor text, anchor text relevance, anchor text change[/tags]

Anchor Text Algorithm Change?

September 3, 2008

Working for a search engine optimization company allows me to stay on top of the changes in Google's algorithm. Recently we noticed a huge shift in websites' ranking results and began to analyze why the change was occurring. Please leave a comment if you have noticed similar changes with your website.

As stated, quite a few of the websites that we track have had their SERP rankings altered quite recently, and we knew it wasn't a result of being banned from the results but simply a change in the algorithm. The commonality between all the sites? Anchor text links for a given keyword. The websites in question were part of a trial SEO activity that relied heavily on anchor text incoming links, especially for competitive search terms.

An Anchor Text Flaw in Google's Algorithm?

Was anchor text significance a flaw in Google's algorithm? In short, somewhat. Google is king of natural results; by that I mean they want to present the most relevant and natural website results possible. As such, they don't want variables that let spam techniques manufacture results for a given search term. Take the terms 'insurance' or 'loans'. Obviously, if a website has thousands or millions of incoming links with the term 'insurance', it doesn't mean that the website is a credible source for that information. It just means they have a very large budget for purchasing incoming links for that term. That is completely unnatural and against the general Google philosophy. A normal incoming link would simply use the company name, or possibly the company name and the word 'loan'. How many people would link just the term 'loan' to a given website? Not very many.

Staying Ahead of Google

Obviously everyone's goal is to stay ahead of the Google algorithm, changing constantly to mesh with it and maintain outstanding SERP results. A little trick is to vary the anchor text you are building. Don't have all incoming links with a keyword of 'loans'; rather, change it up where possible. If you are paying for links, use your main keyword in the anchor text 60% of the time and vary the remaining 40%. For example, 60% of your incoming links could have your main keyword 'loans', and the remaining 40% could be distributed among 'company name loans', just your 'company name', the type of loan you provide such as 'mortgage loans', or even the area you are targeting. The point is to make things appear natural; if over 90% of your incoming links carry a single anchor text, you can bet it wasn't a natural way of achieving links.
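As a quick sketch of that split (in Python; the keyword, company name, and variations below are invented for illustration), you could plan the anchor text for a batch of links like this:

```python
import random

MAIN_KEYWORD = "loans"
# Hypothetical natural-looking variations for the other ~40%.
VARIATIONS = [
    "Acme Financial",
    "Acme Financial loans",
    "mortgage loans",
    "San Diego loans",
]

def plan_anchors(total_links, main_share=0.6):
    """Assign an anchor text to each planned link: ~60% main keyword."""
    return [
        MAIN_KEYWORD if random.random() < main_share
        else random.choice(VARIATIONS)
        for _ in range(total_links)
    ]

plan = plan_anchors(50)
print(f"{plan.count(MAIN_KEYWORD)} of {len(plan)} links use '{MAIN_KEYWORD}'")
```

The exact percentages matter less than the spread itself: a distribution with several distinct anchors simply looks more like links people gave you voluntarily.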

[tags]anchor text link help, anchor text changes algorithm, google algorithm changes anchor text[/tags]

Matt Cutts on Paid Links: What Does Google Think?

September 2, 2008

As we all know, search engines, Google especially, want to keep their results as clean and accurate as possible, bringing searchers the most relevant and best-suited information. Over the past year we have watched paid link websites removed from Google's search results, and websites removed for partaking in such activity. So it comes as no surprise that Matt Cutts released a post discussing paid links and how Google approaches such activity.

I absolutely love search engines and watching how the algorithms have changed over time. I remember when META tags were one of the most important aspects, and then how incoming links were analyzed and added to the mix. It was all about incoming links at one time, regardless of where or how they came in. As such, directories and link exchange programs sprang up to help give some the edge over others. Just think about everything that was once needed to rank well in the SERPs: contextual advertising, three-way link exchanges, banner advertising with DoFollow links, and other techniques.

Over time we have been wondering what exactly Google was doing, watching sites get completely removed, then re-added to the search results. It's like watching a huge pest control company trying to eliminate the pests, only for another batch to replace those already 'squashed'. Websites like Text Link Ads (TLA) and TNX were removed, and some websites were punished for partaking in such link buying activities. After the last PR update you should have seen the message boards, with people reporting drops from PR 6 to PR 1! After analysis, the conclusion came down to selling links on their sites and/or purchasing links for their sites. So what did Matt Cutts say about it?

Do paid links violate Google’s quality guidelines?

Not necessarily. Cutts says the only paid links he cares about are ones designed to game search engines. He cites an example of a Linux site with a group of sponsored links for casinos, drugs, and gifts. Aside from the apparent spamminess, the links are presented in image format, which Matt thinks is to avoid detection.

“I’m sure,” he writes, “some people will happily defend links like these, but in my experience people who search on Google don’t want links like these to affect Google’s search results.”

“Google is not interested in reports on affiliate links or directories, just spammy gaming attempts like the example mentioned.”

Well, it appears that Google is only going after the 'spammy' websites, although it's tough to say what exactly separates a spammy website from an overly zealous webmaster looking for backlinks. I've seen many websites get banned simply because the webmaster didn't know better with his incoming link building strategy. Matt also stated that paid link report data is not currently being used (this has to be dated, given the PR drops people saw after the last update) and that nothing is really being applied to the current algorithm. Maybe it comes down to a human review that decides whether given websites should be removed? It's tough to say. And what about competitive sabotage? How does Google tackle 'blackhat' attempts to remove a competitor by sabotaging them?

“We’ve always tried very hard to prevent site A from hurting site B. That’s why these reports aren’t being fed directly into algorithms, and are being used as the starting point rather than being used directly. You might also want to review the policy mentioned in my 2005 post (individual links can be discounted and sellers can lose their ability to pass on PageRank/anchortext/etc., which doesn’t allow site A to hurt site B).”

So it appears that the game as we know it has begun to change, and I personally believe the change arrived with the last PR update. As search engine algorithms continue to become more and more complex with one variable in mind, content relevance, we are going to see huge transitions. I think we are beginning to see that now!

[tags]google paid link, paid link google’s idea, what google thinks of paid links, search engine results[/tags]

Press Release SEO Strategy

September 1, 2008

I work at an SEO company based out of San Diego, and we gather a tremendous amount of data from our clients about what works in the world of SEO and what doesn't. Recently we tried a new technique with one of our most competitive keywords and had some pretty outstanding results. Bouncing around spots 2-4 for the competitive keyword, we were wondering what we could do to move up and capture the number one spot. We decided to give the good ol' press release strategy one more try, using paid press release directories and following a very precise strategy in those releases. We wanted to make sure that our story was newsworthy as well as to make the press releases as SEO friendly as possible. Here is what we did:

Consider Your Keywords

Remember that we are dealing with a press release here, not just some random article directory. Some keywords that you may be trying to target simply will not work for this type of SEO activity. So when you are considering your keywords, make sure that they are press release friendly as well as highly searched terms. Remember, I'm talking about PAID inclusion on PR websites; sure, you could try this with free ones, but we have had some really great results from paid inclusion sites alone. So you want to make sure you dot your i's and cross your t's before submitting a paid release. Do a little background check on your targeted keywords and make sure the search volume exists before you go through with this method.

Choosing Your Title

Remember that we took this from a search engine optimization approach, not one for public recognition or brand awareness. We wanted the most emphasis and 'boost' in our SERPs, so we made the title short and sweet with all the emphasis on the keyword we were trying to optimize for. I suggest the same approach for all SEO purposes; the chance that someone will pick up your press release and contact you for media purposes is practically nonexistent. Make sure to target your keyword in your title.

The Body

Again, let's get as much SEO juice out of this press release as possible, so we are going to need to sprinkle our keyword several times throughout the body. Remember, we are not keyword stuffing; that won't benefit anyone, and your release probably wouldn't be published. Check with your potential press release directory before setting up your release; each directory has different rules for creating one, so follow them closely so that your release isn't denied. If possible, add a link with your keyword going to your page; this will add additional benefit for your keyword and help with SEO. Remember, though, that your body needs to flow and sound legitimate. People do end up reading these sometimes, so make sure not to ruin your reputation by putting out some chopped-together release. You can mix keyword inclusion and readability!
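As a rough sanity check before you submit (a sketch in Python; the 2% threshold is my own assumption, not any directory's published rule), you can measure how often your keyword appears relative to the body length:

```python
import re

def keyword_share(body, keyword):
    """Fraction of the body's words accounted for by keyword mentions."""
    words = re.findall(r"[\w']+", body.lower())
    mentions = body.lower().count(keyword.lower())
    return mentions * len(keyword.split()) / max(len(words), 1)

# Invented sample text for illustration.
body = (
    "Acme Charters announces new deep sea fishing trips out of San Diego. "
    "The deep sea fishing packages launch this fall."
)
share = keyword_share(body, "deep sea fishing")
print(f"keyword share: {share:.1%}")
if share > 0.02:  # assumed rule of thumb, not a published limit
    print("Reads as stuffed; rewrite so the keyword appears naturally.")
```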

Distributing Your Press Release

There are hundreds of press release directories that you can submit your release to. There are also services on the internet where you simply give them your press release and they submit it to many directories at once. I personally suggest not overdoing it with those types of services; submit your own press releases, with different variations, to bypass any duplicate content penalties.

What should you look for in a directory? First, start with the PageRank of the domain URL. A good press release directory will have a PageRank of 5 or more and will generally charge around $50 for inclusion. We tried the free directories with one competitive keyword and it didn't result in much SERP change, but when we submitted our press release for a different domain and an equally competitive keyword through a paid directory, we jumped to the top! You get what you pay for, I guess. So analyze each directory before submitting your release.

Check the Rankings

A few weeks afterwards, analyze your results from that submission. You will notice a small increase in visitors from the press release. Are those visitors converting? Most importantly, how are your SERPs for the keyword you were targeting? Things like this are important to review and consider before you submit to additional directories!

[tags]press release submission, press release seo, press release search engine optimization, press release SEO help, press release for SEO[/tags]

Overcoming Being De-Indexed by Google

August 29, 2008

I get this question a lot, especially from new webmasters venturing into the unknown. In fact, I recently had a very good friend of mine fall victim to having his well-established website de-indexed! So we began backtracking, trying to figure out where he went wrong. Want to know what his problem was? His web hosting company! His site was down way too often for his own good, and he eventually got the index boot because of it. Here are some of the things that we checked before coming to that conclusion.

  1. Robots.txt – It's important to verify the contents of your robots.txt, especially if you have recently made any changes. Web crawlers pull up the robots.txt file before crawling your site to understand how you want it indexed. Make sure that you don't have any directives blocking the website from being indexed (see the sketch after this list).
  2. .htaccess – This is another good place to look. If you recently changed the contents of your .htaccess, make sure that you didn't break anything that could cause you to be de-indexed.
  3. Downtime – As stated above, this was the reason my friend's website was de-indexed. His host was unreliable (yet cheap), but what's the point of cheap if you have a lot of downtime and eventually get your website de-indexed? Pony up the difference for a quality host.
  4. No New Content – Though extreme, this can happen if you do not update your website. Make sure that you update things at least monthly and keep your website fresh!
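For the robots.txt check in particular, Python's standard library can tell you whether Googlebot is allowed to fetch a page (the domain below is a placeholder for your own site):

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL; point this at your own site's robots.txt.
rp = RobotFileParser("http://www.example.com/robots.txt")
rp.read()  # downloads and parses the live robots.txt

# False here means your robots.txt is telling Googlebot to stay out.
print(rp.can_fetch("Googlebot", "http://www.example.com/"))
```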

Did Google de-index your site? Want to get re-indexed and establish what caused the problem? Great, here is what you need to do.

  • Register Your Site With Google Webmaster Tools – If you haven't done this yet, you absolutely need to check out Google's Webmaster Tools. It really gives you the 411 on what is going on with your website, why Google likes or dislikes it, and how you can improve it. Once inside, it's time to begin playing around and getting used to the back office. This is a very nice tool for gauging your website's health and whether it's 'clean' in Google's eyes.
  • Build a Backlink Infrastructure – If for some reason you don't have a nice backlink structure, it's time to begin building one. Don't go overboard or you can find yourself BANNED if the process doesn't look natural. So keep things natural and create some beautiful backlinks. Check out my 30 Day Guide or Backlink Ideas sections for hundreds of quality backlink building ideas.

[tags]deindex google, banned google, sandbox google, help with google, index issues google[/tags]
