Well, instead of serving a static HTML page, JS interacts with the user and handles web page behavior on the client side. This is its main purpose and the reason for its dominant use.
- string – this represents the value to be parsed. If the value is not a string, it will be converted to one using the ToString abstract operation.
- radix – this represents the numeral system you want to be used. It can be an integer between 2 and 36. Although it is optional, it is recommended to always define a radix (base value) to avoid confusion or unexpected results. A radix of 10 (digits 0 to 9) is the most common.
Literal: let re = /ab+c/;
Constructor: let re = new RegExp('ab+c');
It may be that additional content is critical to your site and you want the user to scroll or click to reveal it. You should know that only the initially visible content (no clicking or scrolling required) will be indexed in the first wave of indexing. For the rest of that content to be indexed as well, you will have to wait for rendering.
If your important content is not indexed, it won't be discoverable via search. So if this additional content carries key information or internal links, you should take all necessary steps to have it crawled and indexed by Googlebot.
Web apps that rely heavily on the React library may not need any search engine optimization at all, as SEO was never a design goal for them (realtime chat apps, social media apps). However, many React web apps built around content, such as e-commerce pages or blogs, do require search engine optimization. So if you want to rank better, you should look into ReactJS SEO best practices and tips in order to set up your page so it works well for both users and search engines.
So, let's say Googlebot comes across your JS file; it will then download it. The next step is to parse, compile, and execute the JS using the Google Web Rendering Service (WRS). WRS also fetches data from any external APIs you use. Then the indexer actually indexes the rendered content. If Google discovers new links in this content, it adds them to the crawl queue and the cycle continues.
If we don't have JS, Google runs through the plain HTML without the extra rendering stage. This stage is expensive, which is why Google separates indexing and rendering to speed things up: content that doesn't require JS is indexed first. This means it may take longer for JS content to appear or update in the index. Because of this, it's important to optimize a website like this, as not everything might be indexed during a given crawl period.
So while crawling and indexing of the normal content is done immediately, the JS that needs to be executed and rendered gets indexed in the second wave of indexing. If you have important content in it and you want your updates to appear immediately in the SERPs (price changes for an e-commerce site), then you shouldn't wait for the second wave of indexing. The solution is pre-rendering or server-side rendering (SSR).
SSR and Vue.js
Besides the time delay in indexing, there are other possible pitfalls you can avoid with SSR. One of them is Googlebot skipping some resources or failing to fully render the webpage. This is where we recommend Vue.js, one of the most popular JS frameworks. With an active community of users, an established ecosystem of developers, solid tooling, and extensive documentation, Vue.js makes SSR quick to set up.
If you use Vue.js server-side rendering, crawlers will be able to see the full page immediately. This way you avoid the waiting time and the risk that your page will not be indexed at all. The key decision is knowing whether you need prerendering or SSR. Consider which of your pages need to be displayed immediately: if it's something smaller like the about-us or contact page, prerendering will do. Don't go for SSR if you expect a lot of traffic and your server can't handle it without additional load time, as you risk an increased bounce rate.
JS on its own is not bad for SEO. Anything that improves the user experience, like fast interactivity with the website or faster loading times, definitely sends a positive signal to Google. However, you have to be really mindful of indexing before you decide to add a lot of JS to your pages. Choose wisely between prerendering and SSR. There is no one-size-fits-all solution, and it all depends on whether you value faster indexing or faster loading for your users.