That’s right. I just closed a new SEO client and have to ramp up my mojo to build a plan for this new challenge. This is particularly relevant since I recently started at a new company (PPC Associates), where I am building an SEO practice.
New SEO clients are a great challenge since many of these clients know very little about how SEO works. I say “challenge” because there can be a fairly steep learning curve in educating the client and their staff about what SEO involves and the changes it can require. What these clients often do not realize is that an SEO may suggest all sorts of modifications to a site’s architecture, content, linking practices, and in a number of cases, internal processes. You can’t imagine some of the eyebrow-raising and gnashing of teeth these suggestions have caused.
The start of a new SEO project means doing an audit designed to surface any glaring issues that may be causing problems. The last thing you want to do is start optimizing pages only to discover a server or domain issue that could delay or even nullify the optimization efforts.
Without further ado, here are five analytical topics you should study when you first take over a new account:
Recently, I was looking at a site and checking the domain to see if there were any lingering 302 “temporary” redirects that could impact traffic. There are numerous free server header checkers that provide server status for the various domain and sub-domain iterations that need to be associated with each web property. I will check several options, including:
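The same check can be scripted. Here is a minimal Python sketch (the function names `fetch_status` and `describe_status` are my own, not from any particular tool) that reads a URL’s raw response code without following redirects, so a lingering 302 stays visible instead of being silently followed:

```python
# Sketch: inspect a URL's raw HTTP status without following redirects.
# fetch_status and describe_status are hypothetical helper names.
from urllib.parse import urlparse
import http.client

def fetch_status(url, timeout=10):
    """Return the status code the server sends for `url`, redirects not followed."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=timeout)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

def describe_status(status):
    """Map a status code to the SEO concern it raises, if any."""
    if status == 301:
        return "permanent redirect (passes link equity)"
    if status == 302:
        return "temporary redirect (may not consolidate ranking signals)"
    if status == 200:
        return "OK"
    return f"status {status} - investigate"
```

Running `describe_status(fetch_status("https://www.client-domain.com"))` against each domain and sub-domain variation quickly flags which ones answer with a 302 instead of the 301 you usually want.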
One of the first things I do when someone mentions a domain as a candidate for SEO work is to check the site using a search with the “site” operator. The Google query will look something like “site:www.client-domain.com” and will return a result showing all of the pages Google is indexing.
The technique also shows how the search result for each of the client’s pages is formatted. In the example below, you can see the “site” operator and some of the site-specific pages. The results show roughly 96,000 indexed pages for the Whitehouse.gov site. You can compare this number to the number of pages actually published to find the difference. If the indexed number is unusually low, investigate what is preventing pages from being indexed.
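The indexed-versus-published comparison can be reduced to a quick ratio check. This is a sketch, not an established metric; the 80% threshold is my own assumption and should be tuned per site:

```python
# Sketch: flag unusually low index coverage.
# The 0.8 threshold is an illustrative assumption, not a standard.
def coverage_ratio(indexed, published):
    """Share of published pages that appear in the search index."""
    return indexed / published if published else 0.0

def needs_investigation(indexed, published, threshold=0.8):
    """True when index coverage falls below the chosen threshold."""
    return coverage_ratio(indexed, published) < threshold
```

For instance, if a site has published 200,000 pages but the “site:” query reports 96,000 indexed, `needs_investigation(96_000, 200_000)` returns `True`, which is exactly the signal to start digging for crawl or indexation problems.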
Now that we can see how the search results are formatted, we can look at how the website’s pages use keywords to describe their content. The page title or “title tag” is the primary indicator that tells the search engine what the page is about. We call this the “theme” of the webpage. The search crawler uses this theme to inform how to index the page. One way to look at this is to think of the search index as a very large card catalog like you find at the public library. What is different is that instead of a catalog of books, this is a catalog of webpages, each with its own theme.
At this point, I am looking to see the content and structure of the title and the snippet from the search results. The content of the search result for each page tells me if the target theme is clear enough to warrant being indexed for a popular term. Often I find that the content is more generically focused, such as having the title simply state “Services”. The company may know what is contained on the page, but there is too little information for the crawler – let alone the searcher – to figure out what this might mean and how it could be valuable.
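Pulling title tags out of pages in bulk makes this review faster. Below is a minimal Python sketch using only the standard library; the `is_too_generic` heuristic (titles under three words) is my own rough assumption for flagging titles like “Services”:

```python
# Sketch: extract a page's <title> and flag overly generic titles.
# is_too_generic's three-word cutoff is an illustrative assumption.
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text content of the <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

def is_too_generic(title, min_words=3):
    """Very short titles rarely carry a clear theme for crawler or searcher."""
    return len(title.split()) < min_words
```

A title of “Services” gets flagged, while something like “Plumbing Services in Austin TX” passes, which matches the point above: the page needs enough information for both the crawler and the searcher.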
The role of the SEO professional is to research keywords that can enhance the value of the page. The objective in selecting relevant keywords is to find those terms that have the best mix of desirability (keywords with traffic) and winnability (minimizing competition). The best keywords are those with good traffic that the site can rank for prominently within a relatively short time frame.
The search engine crawlers are much more pervasive than many of us realize. In some cases, the Google crawlers can know about a website’s pages within hours of them going live. The sophistication of new crawler technology borders on Big Brother in terms of how quickly and thoroughly new web properties can be found and crawled.
Even with all of this sophistication, though, there is still a lag between pages going live and being cached and available to web searchers. One way to accelerate this process is to create and submit an XML sitemap directly to the search engines.
The sitemap is simply a list of all a site’s URLs that can be accessed directly by the crawlers. The bigger the site, the more important the need for sitemaps since pages on a large site can be located many directory levels beyond the homepage.
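A sitemap in the sitemaps.org XML format can be generated with a few lines of standard-library Python. This is a minimal sketch covering only the required `<loc>` element; real sitemaps often add optional fields like `<lastmod>`:

```python
# Sketch: build a minimal XML sitemap per the sitemaps.org protocol.
# Only the required <loc> element is emitted; optional fields are omitted.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Return a sitemap XML string listing each URL in `urls`."""
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
    return tostring(urlset, encoding="unicode")
```

Feeding it the full list of a site’s URLs, including pages buried many directory levels deep, gives the crawlers a direct path to everything, which is exactly why sitemaps matter more as sites grow.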
What data you collect, and how you collect it, is critical to seeing the results of any optimization effort. It is important to be able to track the results once the changes have been made. Some of the metrics include:
1. Overall traffic (total monthly traffic to the site)
2. Search traffic and keywords
3. Referral traffic landing pages
4. Ranking data for target keywords and preferred pages
5. Linking data and use of anchor text
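Tracking those metrics over time comes down to comparing before-and-after snapshots. Here is a small Python sketch (the `compare_snapshots` function and the metric names are my own illustration, not tied to any analytics package) that computes the percent change for each tracked metric:

```python
# Sketch: compare two monthly metric snapshots after changes go live.
# Function and metric names are hypothetical, not from a specific tool.
def compare_snapshots(before, after):
    """Return {metric: percent_change} for metrics present in both dicts."""
    changes = {}
    for metric, base in before.items():
        if metric in after and base:
            changes[metric] = round(100 * (after[metric] - base) / base, 1)
    return changes
```

For example, comparing `{"overall_traffic": 1000, "search_traffic": 400}` before the changes with `{"overall_traffic": 1200, "search_traffic": 500}` after yields a 20% lift in overall traffic and 25% in search traffic, so each of the five metrics above gets a clear direction and magnitude.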
There are numerous elements and tactics involved in SEO analysis; these are just a few of the important ones. Each project is different and has its own intrinsic issues. For this reason, it is essential to take a customized view for every website.