Web Design and Internet Promotion: Making A 100% Crawlable Site – Jumping Over the Google Sandbox
What is the Google Sandbox?
Although Google has never directly confirmed that a sandbox exists, SEOs have long observed that new sites, no matter how well they are optimized, do not perform well in search until they are out of the Google Sandbox. "Google Sandbox" is a term referring to sites that are "restricted" in some way, limiting their ability to do well in the SERPs (Search Engine Results Pages) for some or all search terms. The Google Sandbox effect is a theory used to explain why newly registered domains, or domains with frequent ownership changes, rank poorly in the SERPs. The theory holds that new websites are placed in a "sandbox", or holding area, where their search rankings are kept on hold until the sites prove themselves worthy of ranking. Some website owners buy well-ranked old domains to sidestep this; however, these can be very expensive. If you buy a new domain, it is recommended that you register it for more than one year.
Some older sites can also lose ranking or be dropped from the SERPs because they are poorly structured, have weak or duplicate content, or are outright spammy. Google has refined its search algorithms and is trying to keep spam sites out of search engine results pages.
How can you check if your site is indexed?
Go to www.google.com and enter site:www.yourdomain.com (substituting your own domain) in the search box.
The search results page will display the pages of your site that are currently indexed by Google.
What can you do if your pages are not indexed?
You could have inadvertently made a mistake in your robots.txt file, so it is imperative that you start by verifying this file. Make sure it is not blocking your content folders.
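As a quick sanity check, you can test a robots.txt file against your own content paths before relying on it. Here is a minimal sketch using Python's standard `urllib.robotparser`; the robots.txt rules and paths shown are illustrative placeholders, not rules from any real site:

```python
# Sketch: verify whether a robots.txt file would block a crawler
# from reaching a content folder, using Python's standard library.
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt contents (placeholder rules)
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /content/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the "User-agent: *" rules here, so the
# /content/ folder is blocked while /blog/ remains crawlable:
print(parser.can_fetch("Googlebot", "/content/articles.html"))  # False
print(parser.can_fetch("Googlebot", "/blog/post.html"))         # True
```

If a folder you expected to be indexed comes back as not fetchable, the fix is usually to remove or narrow the offending `Disallow` line.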
To ensure that bots can crawl a site successfully, make sure you do not have broken or dead links. It is also recommended that you use text links rather than graphics to link between your internal pages, since crawlers read anchor text far more reliably than they read images.
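To illustrate the difference, a crawlable text link versus an image-only link might look like this (the URLs and file names are placeholders):

```html
<!-- Crawlable: the anchor text tells both users and crawlers
     what the target page is about -->
<a href="/services/web-design.html">Web Design Services</a>

<!-- Harder to crawl: an image-only link carries no anchor text.
     If you must use an image, at minimum give it a descriptive
     alt attribute so crawlers still get a text signal. -->
<a href="/services/web-design.html">
  <img src="/images/web-design-button.png" alt="Web Design Services">
</a>
```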
It is important to provide a sitemap, not only for search engines but also for users, because it gives easy access to every page on your site. Don't confuse an HTML sitemap with an XML sitemap. An HTML sitemap is created so that users can easily find content on your site. An XML sitemap is created so that search engines know the relative priority of each page and when the site was last updated. Once you create the XML sitemap, it needs to be submitted; you can submit it to Google and Bing through their respective webmaster tools.
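As an illustration, a minimal XML sitemap following the sitemaps.org protocol might look like this (the example.com URLs, dates, and priority values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
    <lastmod>2024-01-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry lists a page's address, when it was last modified, and its priority relative to the rest of the site.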
Good content is widely regarded as the key to search engine optimization. However, it is just as important to make your site useful, interesting, crawlable and easy to navigate. Making a 100% crawlable site should be the top priority.