When it comes to search engine optimization (SEO), there has long been a perceived dilemma between the quality of web content and the loading speed of a web page. The need for a trade-off between these two essential factors stems from now-deprecated search engine algorithms. In the late nineties, search engine results were heavily based on the frequency of keywords within a given site, so it became common practice for site owners to overuse keywords in order to boost their rankings in the results. In time, this irrational emphasis on keywords was phased out in favour of the speed at which a web page loads, and lightweight websites came to be considered more search engine friendly.
As time has passed, the algorithms governing these two factors have changed significantly. Let us briefly highlight these changes, in an attempt to better understand the more effective SEO practices.
Adapting UX (User Experience) Technologies
Search engines have progressively embraced the principles of good UX design. For a site to conform to good UX design, both its load speed and the content it offers must be exceptional. Google's PageSpeed Insights tool, for instance, scores pages out of 100, and a high score is widely regarded as a prerequisite for competitive rankings on the first page of search results. The content must also be highly relevant and usable. For these reasons, SEO experts have worked tirelessly to adopt newer technologies such as HTML5, CSS3, jQuery, MVC frameworks and Web 2.0 technologies, among others. Additionally, we have seen the emergence of responsive web development to cater for mobile web users.
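As a minimal sketch of the responsive approach mentioned above (the class name and breakpoint here are illustrative, not a recommendation from any search engine), a single CSS media query can adapt a page's layout to both mobile and desktop screens:

```css
/* Base styles: full-width, single-column layout for narrow (mobile) screens */
.content {
  width: 100%;
  padding: 1rem;
}

/* Wider screens: constrain line length and centre the column for readability */
@media (min-width: 768px) {
  .content {
    max-width: 960px;
    margin: 0 auto;
  }
}
```

Serving one responsive page in this way, rather than a separate mobile site, keeps a single URL for crawlers to index while still giving mobile users a usable layout.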
Web Crawler Algorithm Updates
Still using Google as the example search engine, we have seen major algorithmic updates such as Panda (2011) and Penguin (2012). These have had a significant effect on black hat SEO techniques. Stuffing keywords into a site's content, or into its meta tags, is now penalized, and violating these guidelines risks having your site removed from the search results altogether. Additionally, obviously manufactured backlinks are now penalized. This, of course, is a measure implemented by the search engines to uphold the principles of good UX design.
With this additional insight, you can better understand that both the load speed and the content of a website are significant factors in search rankings. Banking on one over the other is a sure way to lose. So yes, speed does matter, but it is not the only factor.
Striving for a balance between speed and content is no easy feat. Recent search engine updates have signalled the inception of a system that rewards websites holistically, where user experience and the quality of the content are both factored into the search algorithms. This means that more websites will require professional help in order to become SEO friendly.