Search engine operation – crawling, indexing and ranking
The crawl phase is the exploration stage. The process is very complex and relies on programs called spiders (or crawlers), which follow links from page to page and fetch their content. Googlebot is probably the best-known crawler.
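To give a rough idea of what a crawler does, here is a minimal, purely illustrative sketch (this is not how Googlebot actually works): it fetches a page, extracts the links it finds, and queues them to visit next. The start URL and the simple link-matching expression are assumptions made only for this example.

    // Toy crawler sketch: fetch a page, collect href links, visit them breadth-first.
    // Real crawlers also handle robots.txt, rendering, deduplication, politeness
    // delays and much more – this only shows the basic fetch-and-follow loop.
    async function crawl(startUrl: string, maxPages = 10): Promise<string[]> {
      const visited = new Set<string>();
      const queue: string[] = [startUrl];

      while (queue.length > 0 && visited.size < maxPages) {
        const url = queue.shift()!;
        if (visited.has(url)) continue;
        visited.add(url);

        try {
          const html = await (await fetch(url)).text();
          // Very naive link extraction, for demonstration purposes only.
          for (const match of html.matchAll(/href="(https?:\/\/[^"]+)"/g)) {
            if (!visited.has(match[1])) queue.push(match[1]);
          }
        } catch {
          // In this sketch, pages that fail to load are simply skipped.
        }
      }
      return [...visited];
    }

    // Example usage (hypothetical URL):
    // crawl("https://example.com").then(urls => console.log(urls));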
Once this stage is completed without errors reported in Search Console, the ranking process begins. At this point, webmasters and SEO specialists must focus on offering high-quality content, optimizing the website, and acquiring and building valuable links in accordance with Google's quality guidelines.
How can JS be SEO-friendly?
In 2015, Google retired its AJAX crawling scheme, and much has changed since then. The technical guidelines for webmasters now ask that Googlebot not be blocked from crawling JS or CSS files, because it can render and interpret pages much like a modern browser.
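In practice this means checking that robots.txt does not disallow script and style assets. The snippet below is a hypothetical illustration; the /js/ and /css/ paths are assumptions and will differ from site to site.

    # Do NOT do this – it stops Googlebot from rendering the page properly:
    # User-agent: Googlebot
    # Disallow: /js/
    # Disallow: /css/

    # Instead, leave script and style assets crawlable:
    User-agent: Googlebot
    Allow: /js/
    Allow: /css/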
Google had other problems to resolve. Some webmasters using JS frameworks configured their web servers to deliver a pre-rendered page, which as a rule should not be necessary. Pre-rendered pages should follow the guidelines and bring a real benefit to the user. Above all, the content sent to Googlebot must match the content shown to the user, both in appearance and in how it can be interacted with. In short, when Googlebot crawls a page, it should see the same content the user sees. Serving different content is cloaking and violates Google's quality guidelines.
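One common way such pre-rendering was set up is to detect a crawler by its user agent and return an already-rendered HTML snapshot of exactly the same content. Below is a minimal sketch, assuming an Express server; the renderToHtml() helper, the bot pattern and the application shell are hypothetical stand-ins for a real pre-rendering setup.

    import express from "express";

    const app = express();
    const BOT_PATTERN = /googlebot|bingbot/i;

    // Hypothetical helper standing in for a real pre-rendering step (e.g. a
    // headless browser). Crucially, it must return the SAME content the user
    // would see after the JavaScript has run – anything else is cloaking.
    async function renderToHtml(path: string): Promise<string> {
      return `<html><body><h1>Fully rendered content for ${path}</h1></body></html>`;
    }

    // Minimal application shell that a regular browser would receive and hydrate.
    const appShell = `<html><body><div id="app"></div><script src="/app.js"></script></body></html>`;

    app.get("*", async (req, res) => {
      if (BOT_PATTERN.test(req.get("user-agent") ?? "")) {
        res.send(await renderToHtml(req.path)); // crawler: pre-rendered snapshot
      } else {
        res.send(appShell);                     // user: normal JS application
      }
    });

    app.listen(3000);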
The progressive enhancement guidelines say that the best way to build a site's structure is to use plain HTML first and only then layer AJAX on top for the look and behaviour of the page. That way you are on the safe side: Googlebot will see the HTML, while the user benefits from the AJAX experience.
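A simple illustration of that approach, as a browser-side sketch: an ordinary HTML link (for example, an anchor pointing to /products.html with the id products-link, and a #content container) works on its own, and a small script then upgrades it to an AJAX-style load when JavaScript is available. The element ids and the URL are assumptions made only for this example.

    // Progressive enhancement sketch (runs in the browser): the plain HTML link
    // already works without JavaScript, so Googlebot and any user can follow it.
    // The script below only enhances the experience when JS is available.
    const link = document.querySelector<HTMLAnchorElement>("#products-link");
    const content = document.querySelector<HTMLElement>("#content");

    if (link && content) {
      link.addEventListener("click", async (event) => {
        event.preventDefault();                  // stay on the current page
        const response = await fetch(link.href); // load the same URL via AJAX
        content.innerHTML = await response.text();
        history.pushState({}, "", link.href);    // keep the address bar in sync
      });
    }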