4 Crawlability Problems to Look Out For

You have finally launched your site and are thoroughly satisfied with its design and content.

Question:

But why is it taking so long for major search engines such as Google and Yahoo to index your pages? Or why haven't your rankings improved for targeted keyword phrases that aren't even competitive?

Potential Answer:

Your website could be having crawlability issues.

What is Crawlability?

Search engines all have what are known as spiders: programs that crawl around the web, reading and picking up content from web pages. Those pages are then indexed according to their relevance, the website's structure and its credibility. For your pages to eventually rank well for relevant keywords in the SERPs, these spiders need to be able to navigate your site's content easily. In other words, your pages need to be crawlable.

4 Things to Check to See Whether Your Website is Crawlable

1. Is your web page content in text?

Search engine spiders are pretty dumb and can only read text. Anything fancy such as JavaScript (Ajax), Flash, images or video will simply be ignored. To solve this problem without sacrificing your visual appeal, balance these elements with adequate text copy.

If you do use images, remember to take advantage of the alt attribute. This is the text that is displayed when an image fails to load, and its main purpose is to let screen readers describe the image to vision-impaired users, so put a concise, accurate description in it.
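For instance, an image element with descriptive alt text might look like this (the filename and wording here are purely illustrative):

```html
<img src="blue-ceramic-teapot.jpg" alt="Blue ceramic teapot with bamboo handle">
```

Spiders cannot see the image itself, but they can read and index the alt text.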

For videos, provide a transcript and fill in the metadata with a detailed description that includes your targeted keyword phrases.

2. Is your server functioning properly?

If a search engine spider comes crawling while you are having server issues, it will not be able to access your site. Although it will come back later to try again, downtime will definitely delay your search engine optimisation efforts.

3. Are you blocking the spider?

Take a look at your robots.txt file or robots meta tags, because this is where a developer defines which files search engines cannot crawl and index. Some of your key pages could have been blocked by accident, so it is always best to double-check this section.
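As a quick sanity check, you can test your own rules with Python's standard-library robots.txt parser. The rules and URLs below are hypothetical, a minimal sketch of how an accidental block would show up:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: everything under /private/ is blocked.
robots_txt = """\
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# A page under the blocked directory is not crawlable...
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))  # False
# ...but the rest of the site is.
print(parser.can_fetch("*", "https://www.example.com/products.html"))      # True
```

Running the same check against each of your key pages will quickly surface any that are blocked by mistake.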

4. Have you created and submitted a sitemap?

A sitemap is worth creating because it lists all the existing pages that fall under a domain name. It lets you see how large your site actually is and locate any broken links that need to be fixed. On top of that, submitting your sitemap.xml to search engines will help speed up the indexing process.
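A minimal sitemap.xml has the following shape (the domain, dates and pages here are placeholders for your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

Each `<url>` entry lists one page; `<lastmod>` and `<changefreq>` are optional hints that tell spiders how fresh the page is and how often to revisit it.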

If you are clear on the above checklist, perhaps you should start building links to your website. Submit your pages to relevant sites, or politely ask your business partners to place a link on their homepage back to yours.

Need a little help with your SEO campaign? Visit Wiliam’s siterank services and ask questions today!