Thanks to its advantages for user interaction and experience, from creating dynamic content and animating images to controlling multimedia, JavaScript is increasingly used on eCommerce sites.
However, with the growth in JavaScript usage comes more challenges for digital marketers, especially those handling search engine optimization (SEO).
Getting Google to index JavaScript content is among the most common challenges in technical SEO today. Despite following all the rules of the game, many site owners still struggle to drive organic traffic, not realizing that their JavaScript SEO could be the reason why.
As an eCommerce digital marketing agency, we’ve been closely monitoring how this type of coding affects our clients and their overall SEO marketing strategies.
In this guide, we’ll share our findings and tips on overcoming some of the biggest challenges with JavaScript SEO, by explaining:
- What JavaScript is and how it works
- Why JavaScript SEO is different from traditional SEO
- And which best practices we recommend for eCommerce sites
What is JavaScript & How Does It Work?
JavaScript (or JS, for short) is an extremely versatile programming language that can be used for everything from developing desktop applications to creating dynamic and interactive websites. It’s one of the three core technologies on the internet, along with HTML and CSS.
JavaScript makes web pages "come to life," so to speak. It's used to personalize websites, deliver notifications, load new content as you scroll toward the end of a page, and more.
Popular websites like Amazon, Walmart, and Target rely heavily on JavaScript, thanks to the many advantages it offers, including:
- Improving website performance
- Making websites more user-friendly
- Enhancing overall user experience
Additionally, using JavaScript can result in increased sales for eCommerce sites. For example, studies have shown that adding a live chat widget to an eCommerce site can increase sales by up to 45%.
Common JavaScript Frameworks
Some popular JavaScript frameworks are React, Angular, and Vue.js. We'll discuss each briefly below:
React
With more than 10 million websites using it, React is one of the most popular JavaScript frameworks out there. Since its release in 2013, it's earned a reputation as a fast, scalable framework built around reusable components for interactive, responsive user interfaces.
React supports enterprise applications and is used by Netflix, Airbnb, and Facebook, among others. It corresponds to the View layer in the Model-View-Controller (MVC) pattern and is used on the front end to build rich user interfaces for websites and apps.
React gained popularity due to its ease of use, especially since it only requires software engineers to have a basic understanding of HTML and JavaScript to use it.
Vue.js
Vue.js quickly gained popularity as a front-end JavaScript framework that is versatile, approachable, and fast. Its main focus is on building SPAs, or single-page applications.
The main benefit of using this framework is its incremental use model, meaning the framework can be used in limited amounts, whereas others require full adoption. Like React, Vue also supports the View layer of the MVC paradigm.
Additionally, the framework allows developers to write page templates in standard HTML. These templates (and other code) can then be packaged into components to streamline updates and for easy reuse.
Angular
Angular (the successor to AngularJS, which debuted in 2010) is a TypeScript-based JavaScript framework. It's one of the most popular frameworks and is used by companies such as Google, Forbes, and PayPal.
A comprehensive framework, Angular offers everything from data-binding to routing. Angular focuses on building SPAs that are both sophisticated and efficient and supports the MVC architecture.
With data-binding, data stays automatically synchronized between the model and the view. This saves developers the time and effort of manually wiring up UI updates.
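As a quick illustration, here's a hypothetical Angular template snippet showing two-way binding (this assumes the FormsModule is imported):

<!-- As the user types, user.name updates; when user.name changes, the input updates -->
<input [(ngModel)]="user.name">
<p>Hello, {{ user.name }}!</p>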
Angular also supports dynamic rendering, which is beneficial for SEO. (More on this later.) It also lightens the load on an application's server by relying on the browser to build the page, leading to quicker load times.
One drawback: Angular is written in TypeScript, a superset of JavaScript, which can mean longer onboarding for engineers unfamiliar with it.
How Google Processes JavaScript
Google processes JavaScript in three stages: crawling, rendering, and indexing.
Googlebot (Google’s crawler) puts pages in a queue for both crawling and rendering.
Crawling is the process of discovering web pages. During this stage, the crawler scans the web to find new and updated content.
When Googlebot picks up a URL from the crawl queue, it first reads your robots.txt file to verify that you've permitted crawling. If the URL is disallowed, Googlebot skips the HTTP request and bypasses the page entirely.
Rendering is the second stage, during which the HTML, CSS, and JavaScript files that comprise a webpage are processed and turned into a format that can be easily read and indexed by Google.
Indexing is the third stage, where rendered content is added to Google’s index and has the ability to appear in search engine results pages (SERPs).
What is JavaScript SEO?
One of the biggest advantages of using JavaScript for eCommerce sites is that it can help create a more seamless, user-friendly experience for website visitors.
eCommerce websites are a prime example of dynamic content delivered via JavaScript. For example, online retailers frequently use JavaScript to load items into category pages and update product listings dynamically. This makes sense: their inventory is in a continuous state of change as products sell.
While this paints a rosy picture, JavaScript can pose a challenge for SEO.
Search engines can run into issues executing your JavaScript code, which means that certain page content may not be indexed or returned in search results. Therefore, if your website is heavily reliant on JavaScript and you don’t take the steps to prepare your SEO, you may not see an improvement in your organic performance.
In other words, without proper SEO strategies, your site may not be crawled/indexed appropriately.
If you implement JavaScript on your eCommerce site, you’ll need to pay attention to specific areas of your site to ensure that your pages are both crawlable and indexable. This technical process is called JavaScript SEO.
In many ways, it's the same as traditional SEO. It helps your site get crawled by search engine bots so your pages are more likely to be served up to users in the SERPs. It also involves speeding up your page load times and troubleshooting any JavaScript-specific issues that arise.
However, the main difference between JavaScript SEO and traditional SEO is the technical challenges associated with JavaScript sites.
In order to process JavaScript, Google has added an extra step (rendering) to its indexing process. If there are any issues with the JavaScript code, Google won’t be able to see the content to add it to its index.
JavaScript SEO Best Practices for eCommerce Websites
In the many years that we’ve worked with eCommerce clients, our technical SEO audits have highlighted some common mistakes and many opportunities for improvement.
Many of the dev teams we've worked with don't consider SEO when building websites; some don't know about SEO at all. They can develop great-looking websites that run fast for the user, but fail to account for what bots need for indexing, resulting in SEO-related issues later on.
So, here are a few JavaScript SEO best practices to keep in mind when building or revamping your eCommerce website:
1. Use Dynamic Rendering
How your site renders code hugely impacts how Google indexes your JavaScript content.
Therefore, when strategizing your JavaScript SEO, you will need to have a good grasp of the different ways JavaScript rendering occurs:
- Server-Side Rendering (SSR): The rendered page is sent to the client (Googlebot, the browser, etc.). The method for crawling and indexing a website is similar to how it’s done for HTML pages, and JavaScript-specific difficulties should not exist. However, SSR can be challenging for developers and can increase server load.
- Client-Side Rendering (CSR): The JavaScript code is executed by the client (the user’s browser or Googlebot). When the client has to render JavaScript, difficulties may arise when Googlebot attempts to crawl, render, and index material.
- Dynamic Rendering: An alternative that combines the strengths of both, dynamic rendering serves regular visitors the client-side JavaScript experience while presenting a pre-rendered, static version to Googlebot. In effect, it's pre-rendering just for search engine bots. For this reason, we recommend dynamic rendering over pure client-side rendering. (A minimal sketch follows this list.)
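Here's a minimal sketch of dynamic rendering as a user-agent check in a Node/Express server. Everything in it is an assumption for illustration: the bot list, the xyz.com domain, and a Rendertron-style prerender service running at PRERENDER_URL (Node 18+ is assumed for the global fetch):

const express = require('express');
const app = express();

const BOTS = /googlebot|bingbot|baiduspider|duckduckbot/i; // assumed bot list
const PRERENDER_URL = 'http://localhost:3001/render'; // assumed prerender service

app.use(async (req, res, next) => {
  if (BOTS.test(req.headers['user-agent'] || '')) {
    // Bots get a static, pre-rendered snapshot of the requested page
    const pageUrl = encodeURIComponent('https://xyz.com' + req.originalUrl);
    const rendered = await fetch(PRERENDER_URL + '/' + pageUrl);
    res.send(await rendered.text());
  } else {
    next(); // human visitors fall through to the normal client-side app
  }
});

app.listen(3000);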
2. Properly Route Your URLs
It’s important to update your URLs when updating content.
JavaScript frameworks use a router that lets you map clean URLs. For this reason, we don't recommend using hashes (#) for URL routing: if you have a URL like xyz.com/#something, anything that comes after the hash is usually ignored by the server. This problem is particularly prevalent in early versions of Angular and Vue.
If you use the Vue router, consider using “History Mode” instead of “Hash Mode,” like so:
const router = new VueRouter({
  mode: 'history', // use the History API instead of hash (#) URLs
  routes: [] // the array of route definitions
})
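One caveat, noted in the Vue Router documentation: in history mode, the server must return your app's index.html for any deep URL requested directly, or those visits will 404. Here's a minimal sketch using an Express static server (the dist/ build folder is an assumption):

const express = require('express');
const path = require('path');
const app = express();

// Serve the built SPA assets (assumes a standard dist/ build folder)
app.use(express.static(path.join(__dirname, 'dist')));

// Catch-all: return index.html so history-mode URLs resolve on direct visits
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(8080);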
3. Follow Internal Linking Protocol
One of the most common problems we’ve seen is that developers use JavaScript for links that Google cannot crawl, such as onclick or button-type links.
Here are a couple of examples of links Google can follow (and, below them, one it can't):
- Ex 1: <a href="/good-link">Will be crawled</a>
- Ex 2: <a href="/good-link" onclick="changePage('good-link')">Will be crawled</a>
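For contrast, here's a hypothetical link that exists only in an event handler; with no href attribute, Googlebot has nothing to follow:
- Ex 3: <a onclick="changePage('bad-link')">Won't be crawled</a>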
If you want Google to reliably discover and follow your links, make sure they're plain HTML anchor tags with href attributes. Good, clean internal linking is one of the essential factors for Google.
4. Use Pagination
Another tip is to include all navigational elements in your HTML response.
You might think including your main navigation is common sense and all that's needed, but footers and sidebars contain crucial contextual links and need to be included as well.
Pagination is especially important in eCommerce. While allowing users to scroll a feed infinitely is good for user experience (UX), it’s not necessarily SEO-friendly. Why? Because crawlers don’t interact with your page and therefore cannot trigger events to load supplemental content.
Eventually, Google will stop scrolling or clicking "View More," reach its limit, and move on. This means deeper pages can get buried and crawled less often, resulting in poor SERP rankings.
We see this specific issue a lot with our clients and recommend <a href> links instead, which allow Google to see the second page of pagination.
For example, avoid using <a onclick="goto('https://xyz.com/bags/')">. Instead, go with <a href="https://xyz.com/bags/">.
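As a hypothetical illustration, a crawlable category page would render its pagination as plain anchors in the initial HTML:

<!-- Each page of the category is a real URL Googlebot can request -->
<ul class="pagination">
  <li><a href="https://xyz.com/bags/?page=1">1</a></li>
  <li><a href="https://xyz.com/bags/?page=2">2</a></li>
  <li><a href="https://xyz.com/bags/?page=3">3</a></li>
</ul>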
5. Lazy Load Your Images
While Googlebot supports lazy loading, it does not "scroll" like we humans do when visiting web pages. Instead, Googlebot simply resizes its virtual viewport, making it taller when crawling web content. As a result, the "scroll" event listener is never triggered, and content that waits for a scroll event is never rendered for the crawler.
So, if you have several images below the fold, it's important to lazy-load them in a way that doesn't depend on scrolling. That way, Googlebot can still see all your content.
Below is sample code using the IntersectionObserver API, which triggers a callback when an observed element becomes visible. It's more flexible and robust than an on-scroll event listener, and it's supported by modern Googlebot. The lazyload callback shown here is a sketch that assumes each image stores its real URL in a data-src attribute.
This code works because Googlebot resizes its viewport to "see" your content, causing the observed images to intersect and load without any scrolling:
document.addEventListener('DOMContentLoaded', function () {
  // Sketch of the lazyload callback: swap in the real image once at least
  // half of it enters the viewport (assumes the real URL sits in data-src)
  function lazyload(entries, obs) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        obs.unobserve(entry.target); // stop watching once loaded
      }
    });
  }
  var observer = new IntersectionObserver(lazyload, {
    threshold: 0.5
  });
  var images = document.querySelectorAll('img[data-src]');
  for (var i = 0; i < images.length; i++) {
    observer.observe(images[i]);
  }
});
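Alternatively, modern browsers (and Googlebot) support native lazy loading via the loading attribute, with no JavaScript at all; the image below is hypothetical:

<img src="tote-bag.jpg" loading="lazy" alt="Leather tote bag">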
6. Serve Unique Meta Data for URLs
For ranking purposes, each page needs to serve search engines unique metadata (title, meta description, etc.).
Each page on your website should also have a unique URL. Otherwise, search engines may crawl the same metadata for every page, or none at all.
We advise against using fragments in URLs for loading new pages, because Google might simply ignore them. A visitor might view your "Our Team" page at https://xyz.com#our-team, but a search engine will likely disregard the fragment, meaning it never learns that the URL exists.
Note that SPAs using a router package like vue-router or react-router require extra steps to change meta tags when navigating between router views; one common approach is sketched below.
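Here's a minimal sketch (assuming Vue 2 with vue-router; the route, component, and titles are all hypothetical): store a title in each route's meta field and apply it in a navigation hook:

// Hypothetical route component
const BagsPage = { template: '<div>Bags</div>' };

const router = new VueRouter({
  mode: 'history',
  routes: [
    { path: '/bags', component: BagsPage, meta: { title: 'Bags | XYZ Store' } }
  ]
});

// Update the document title after every navigation between router views
router.afterEach(function (to) {
  if (to.meta && to.meta.title) {
    document.title = to.meta.title;
  }
});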
7. Allow Googlebot to Crawl Your JavaScript
It seems simple, but if you’re accidentally blocking your JavaScript (.js) files from being crawled, Google won’t be able to render and index that code.
Review your robots.txt file to confirm that your files are open and available for crawling. You may use the following code in your robots.txt to allow your resources to be crawled:
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
This will also help you make the most of your crawl budget.
How to Evaluate Your JavaScript SEO
There are many tools and guides out there for auditing your JavaScript code and ensuring it’s properly optimized for search engines:
- Google Search Console: Google's suite of SEO tools (formerly known as Google Webmaster Tools) can help you optimize your site and monitor your organic performance.
- site: search operator: This search command (e.g., site:xyz.com) shows only results from one site in the SERPs, allowing you to see what is (and isn't) being indexed by Google.
- Chrome Dev Tools: These web developer tools are built into Chrome for ease of use.
- Ahrefs: This software suite includes tools for backlink-building, keyword research, and more.
- Semrush: This platform includes keyword research and helpful resources on all things SEO.
For more details on running a JavaScript SEO audit, check out these helpful resources:
- The Definitive Guide to JavaScript SEO | Moz
- JavaScript SEO: What You Need to Know | Ahrefs
- How to Do a JavaScript Audit for SEO | The Gray Dot Company
Get a JavaScript SEO Strategy for Your Business
Whether you own an eCommerce website or a non-eCommerce one, you need to make sure that your website’s pages are as visible and indexable as possible to search engines, especially when you’re using JavaScript.
JavaScript has many advantages for your eCommerce business, but also comes with its own set of SEO challenges, even for the most advanced of marketers. Ignoring these can lead to poor user experience and lost revenue for your business.
By following the tips in this JavaScript SEO guide, you'll be one step closer to using JavaScript as effectively as possible on your website.
However, because every website is unique, we recommend hiring a professional to analyze your site for you. Our eCommerce SEO services are designed to help you overcome common JavaScript SEO issues, increase your organic traffic, and improve business-critical page rankings.
If you’re unsure how to address JavaScript SEO for your eCommerce website, our team can help. Contact our SEO experts anytime for a free audit and personalized proposal for your online business.
https://www.goinflow.com/blog/javascript-seo/