
SEO Management

Website Evaluation: This process provides you with a "due diligence" report on your website, examining it for obvious deficiencies from a Search Engine Optimization perspective. It covers identification of issues such as page size, browser compatibility, and spider-friendliness.

Robots.txt file
While evaluating a website, one of the first things you should look at is the robots.txt file. The robots.txt file consists of a set of instructions for search engine robots. It includes directives such as "User-agent" and "Disallow", among others. "User-agent" names the spider the rules apply to; for example, "Googlebot" is the spider name used by Google. The "Disallow" directive tells spiders what to skip, and thus what is left to be indexed; it can include a filename, a directory, or even an HTML page name. You may also use wildcards like '*', as shown in the sample below. Always check that the site has a valid robots.txt file, and always ensure that it is kept in the root folder of the website.
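A minimal sketch of a robots.txt file; the directory and file names here are purely illustrative:

    # Rules for all spiders; '*' here matches any spider name
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /temp/
    Disallow: /print-version.html

    # Rules for Google's spider only; path wildcards like the one
    # below are understood by Googlebot but not by every spider
    User-agent: Googlebot
    Disallow: /*.pdf$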

 

Navigation Structure, Directory Structure
Good navigation on your website helps both the user and the search engines understand the structure of your site. It is very practical to have two sets of navigation: one for the users of the site and the other for the search engine spiders. If you use image maps for your main site navigation, you should consider switching to standard HTML hyperlinks, or your site will most likely not get spidered fully. If you want to keep the image maps you can, but you should add another navigation scheme to your site that uses only standard HTML hyperlinks.
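One common approach is a row of plain text links, often placed in the page footer, that mirrors the image-map navigation; the page names below are just placeholders:

    <!-- Text-link navigation that spiders can follow -->
    <p>
      <a href="index.html">Home</a> |
      <a href="products.html">Products</a> |
      <a href="about.html">About Us</a> |
      <a href="contact.html">Contact</a>
    </p>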

 

Number of pages a website should contain
"How many pages should my website have?" is one of the most common questions asked. There is no magic number, really! The more the merrier, but the quality of a page is of utmost importance. Your pages must be very informative and content rich. Write these pages with the "reader" in mind, and provide crisp and relevant information on each page.

 

Search Engine incompatibility features
This is an important step during the website evaluation process: it tells you what to avoid! Are you using a lot of JavaScript or other scripts in your pages? If you have links inside JavaScript, search engines will not be able to spider them, so it is a good idea to list all your pages in a sitemap and link every page to that sitemap.
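For illustration, a link built like the first snippet below is invisible to most spiders, while the plain hyperlink underneath it is not; the page name is a placeholder:

    <!-- Most spiders cannot follow this JavaScript-driven link -->
    <a href="javascript:void(0)" onclick="window.location='products.html'">Products</a>

    <!-- Any spider can follow this standard HTML link -->
    <a href="products.html">Products</a>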

Avoid using heavy images; they take a lot of time to load and may not go down well with the spider. Also, spiders cannot read images or objects like Java applets. Even though Google has recently started recognizing and indexing Flash files, it is still safer to replicate that information in plain HTML, so that other spiders do not miss out on it. In short, avoid Java applets, Flash, and heavy images.
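Where an image is unavoidable, a descriptive ALT attribute at least gives spiders something to read in its place; a small sketch with an illustrative file name:

    <!-- The alt text is the only part of an image a spider can "read" -->
    <img src="images/red-running-shoes.jpg" alt="Red leather running shoes" width="200" height="150">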

 

Website Content
Building your website content is a key function in the SEO process. The quality of your website content is of paramount importance, not only to the users of your site but also to the search engines. About 200-500 words of unique, high-quality, keyword-rich content can give a big boost to your search engine rankings. Always try to write content for both the search engine spider and the user. There is a very delicate balance between not going overboard with keywords and writing content that makes great reading for the casual browser. Write content in such a way that it covers all your keyword phrases while also working as marketing copy that creates the right impression in the user's mind to purchase your products or services.

 

Dynamic Website
Dynamic websites are those that generate web pages on the fly (from a database or elsewhere). Typically, online retail stores have dynamic pages, which pick up product information and pricing from a database. You can recognize dynamic pages from their URLs: pages that have symbols like '?' and '&' in their URL are dynamic. Search engines often do not index URLs containing these and certain other characters. What this means is that pages with such URLs, and the pages that follow from them, will most probably not be indexed, and your potential customers may never find them.
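A common workaround is to rewrite dynamic URLs into static-looking ones. A minimal sketch, assuming an Apache server with mod_rewrite enabled; the script and parameter names are purely illustrative:

    # .htaccess: serve /products/42.html from the dynamic script
    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ product.php?id=$1 [L]

The spider sees /products/42.html, a clean static-looking URL, while the server still builds the page from the database.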

 

Frameset
Search engines do not index framed sites very well. In fact, search engines do such a poor job of indexing frames that we recommend redesigning your site without them if you want to get good listings in the search engines. As we know, frames are among the main search engine incompatibility features in a website. If you must use frames, there is one way you might still get your website indexed, and that is by making use of the <noframes> </noframes> tags to provide readable content and links for the spiders.
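A minimal sketch of an HTML frameset with a <noframes> fallback; the file names are illustrative:

    <frameset cols="25%,75%">
      <frame src="menu.html">
      <frame src="content.html">
      <!-- Spiders (and browsers without frame support) read this instead -->
      <noframes>
        <body>
          <p>Welcome to our store. <a href="sitemap.html">Browse all our pages here</a>.</p>
        </body>
      </noframes>
    </frameset>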

 

Broken Links and Redirect
Are there any broken links or redirects? Links that return 404 errors are considered broken links. If a spider finds broken/dead links on your website, it might not return again. We recommend removing or fixing such broken/dead links as soon as possible.
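Where a page has genuinely moved, a permanent (301) redirect sends both users and spiders to the new location instead of a 404 error. A minimal sketch, assuming an Apache server; the paths are illustrative:

    # .htaccess: permanently redirect an old page to its replacement
    Redirect 301 /old-page.html http://www.example.com/new-page.html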

 

Sitemap
It is highly recommended to have a sitemap containing links pointing to all the pages on the website. The sitemap should be linked from the home page as well as from every page of the website. The sitemap also acts as an access point for the spider to crawl all the pages on the website.
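A sketch of a simple HTML sitemap page; the page names are placeholders:

    <!-- sitemap.html: one plain link to every page on the site -->
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="products.html">Products</a></li>
      <li><a href="products/red-running-shoes.html">Red Running Shoes</a></li>
      <li><a href="about.html">About Us</a></li>
      <li><a href="contact.html">Contact</a></li>
    </ul>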

 

Keyword Search: A thorough keyword analysis based on your business offerings and the needs of your website audience. We provide a comprehensive report on which keywords would be best suited for your website to drive targeted traffic to your site.

  • Create a keyword list from a variety of sources: website meta tags, competitor websites, and website logs and records. Web logs and PPC campaign results can provide some information. Find out which products/keywords convert most and use those keywords as the base.
  • Use WordTracker to determine search volumes for each term in the previous month. You can ignore separate entries for keywords that differ only in case (e.g., "shoes" and "Shoes" are equivalent). List this information in an Excel spreadsheet.
  • Assign importance to keywords based on the previous month's search volume. This will help you determine which keywords will be most successful if optimized.
  • Do a link popularity search for each of the keyword terms and determine which websites are shown and what their PR is. This should give you an idea of what PR is required to rank at those levels (in addition to equivalent, good content).
  • The combination of the link popularity information and the demand for the keyword should help you determine which keywords you want to select for the pages (an illustrative worksheet follows this list):

* The demand for this keyphrase

* The PR required to be ranked in the top 10
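For illustration only, such a worksheet might look like the following; all numbers are invented placeholders, not real WordTracker or PR data:

    Keyword              Searches last month   Avg. PR of top 10   Selected?
    red running shoes    1,200                 4                   Yes
    running shoes        9,500                 7                   No - PR too high
    buy shoes online     2,400                 5                   Yes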

Meta Tags Optimization: Recommendations on your meta tag optimization based on your page content and the keyword analysis we have conducted, covering (a sample head section follows this list):

  • Title meta tag
  • Description meta tag
  • Keywords meta tag
  • Robots meta tag
  • Refresh meta tag
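A sketch of how these tags might appear in a page's head section; every value here is purely illustrative:

    <head>
      <title>Red Running Shoes - Example Store</title>
      <meta name="description" content="Buy quality red running shoes online with free shipping.">
      <meta name="keywords" content="red running shoes, buy running shoes online">
      <meta name="robots" content="index, follow">
      <!-- Refresh reloads or redirects the page after N seconds; use sparingly -->
      <meta http-equiv="refresh" content="600">
    </head>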

Search Engine Optimized Copywriting: Pertains to writing content for your website that communicates your business message effectively to the human reader while being very search engine friendly, so you can rank high in the search results.

Link Popularity and Link Exchange Programs: Services to obtain links from related businesses pointing to your site, in a reciprocal or an "inward-only" manner. Links pointing to your site from related-theme websites with good page rank help you get a better page rank, which in turn helps you show up higher in search engine results.

Web Analytics: Post-optimization, web analytics helps you monitor and tune your optimization efforts to attract more targeted traffic.

Google Co-op: Services for implementing, managing, and updating Google Co-op functionality on your website. Google Co-op provides an opportunity to leverage your existing site traffic and attract repeat business from existing users.
