
How to blend creative design with strong search performance: SEO for JavaScript


As designers and developers increasingly build robust, app-like experiences into their websites, it’s no surprise that JavaScript use has proliferated across the web. It’s a client-side language that allows for many of the rich interactive features found on websites today. Unfortunately, JavaScript can also pose issues for search engine optimization.

A few years back, you had to choose between having a swanky JavaScript-powered website or ranking well on search engines. Luckily, in 2022 the relationship between JavaScript and SEO has improved a fair bit — it is possible to use JavaScript and still rank well in search, if you follow a few best practices.

Here, Ali Habibzedah, CTO of technical SEO platform Deepcrawl, offers 10 things to watch out for if you want to maintain a site that is both creatively designed and able to meet your organic search goals.

JavaScript SEO best practices for content


Title tags, meta descriptions, and version control

It’s not exactly big news that your title tag and meta description are among the very first impressions your page makes on search engines (and users).

Make sure each page has a unique and descriptive title tag that is not too long. Title tags that extend beyond 70 characters or so are usually clipped in the search engine results pages (SERPs).

Try to use relevant keywords about your subject matter, but avoid looking spammy by packing in too many keywords, as blatant keyword-stuffing could get your site penalized by search engines. Note that meta descriptions are not the only source from which SERP descriptions are populated — search engines look to display the content that is most relevant to the user’s query terms.
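As a quick illustration, a concise and descriptive head section might look like the following (the product and site names are invented for the example):

<title>Handmade Oak Coffee Tables | Acme Furniture</title>
<meta name="description" content="Browse handmade oak coffee tables, built to order and delivered nationwide.">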

When it comes to related JavaScript implementations, don’t rely on cache-control headers to manage your content’s versioning. It’s better to manage versions by controlling the bundle's file name via an automated build tool such as Webpack.
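For instance, a minimal webpack configuration (a sketch, assuming webpack 5) can fingerprint the bundle filename, so any change to the code automatically produces a new file name:

// webpack.config.js: a minimal sketch assuming webpack 5.
// [contenthash] changes whenever the bundle contents change,
// so browsers and crawlers always fetch the current version.
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './src/index.js',
  output: {
    filename: 'bundle.[contenthash].js',
    path: path.resolve(__dirname, 'dist'),
    clean: true, // remove stale bundles from earlier builds
  },
};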

APIs & SEO

APIs on the web are advancing at a very fast rate, and while they offer many useful features, you need to ensure your critical content does not rely on APIs that search engine bots don’t typically support. WebSockets, Workers (dedicated or shared), WebRTC, WebGL and Canvas are all unsupported by search engine crawlers and shouldn’t be relied on for core content delivery.

Remember that many APIs — like Geolocation, Notifications or User Media — require user permission before they will work, so make sure your website still functions correctly when such permission cannot be granted, because search engine bots cannot grant any permissions. The best strategy is to handle errors gracefully using feature detection and then, once an API is available, enhance the functionality of your website using that API.
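As a sketch of that strategy (the rendering helpers here are hypothetical), the core content renders unconditionally and geolocation only enhances the page when the API exists and permission is granted:

function showStoreFinder(stores) {
  renderStoreList(stores); // hypothetical helper: always render the full list first

  // Feature-detect before use; bots and older browsers simply skip the enhancement.
  if ('geolocation' in navigator) {
    navigator.geolocation.getCurrentPosition(
      (position) => sortStoresByDistance(stores, position.coords), // hypothetical helper
      () => {
        // Permission denied or unavailable: the full list is already visible.
      }
    );
  }
}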

If you’re using Web Components (whether with light DOM or shadow DOM), be sure to check whether your final rendered markup contains the content that you want search engines to index.

Be aware that the content you add inside a custom element tag will be replaced by the markup you assign to it via innerHTML or through the shadowRoot. So if you need your content to remain in the rendered HTML, take advantage of the <slot></slot> tag to preserve that content.
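A minimal sketch (the product-card element name is invented for the example) shows how <slot> keeps light-DOM content in the rendered output:

class ProductCard extends HTMLElement {
  constructor() {
    super();
    const shadow = this.attachShadow({ mode: 'open' });
    // The <slot> projects whatever was written between the
    // <product-card> tags, so that content stays in the rendered HTML.
    shadow.innerHTML = '<div class="card"><slot></slot></div>';
  }
}
customElements.define('product-card', ProductCard);

// Usage in the page:
// <product-card><h2>Blue Widget</h2><p>In stock</p></product-card>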

HTTP status codes

Earlier we mentioned that WebSockets and WebRTC are not supported by search engines and should therefore be avoided for serving critical content that you wish to see indexed. It’s best to stick with HTTP and use the status codes correctly. For example, don’t return a 200 (OK) status code when you’re actually responding with a NOT FOUND page; return a 404 instead.
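For example, here is a sketch using Express (an assumption, as are the lookup and rendering helpers; the same principle applies to any server) that returns a genuine 404 along with the not-found page:

const express = require('express');
const app = express();

app.get('/products/:id', (req, res) => {
  const product = findProduct(req.params.id); // hypothetical lookup
  if (!product) {
    // Send the not-found page with a real 404 status, not a 200.
    return res.status(404).send(renderNotFoundPage()); // hypothetical renderer
  }
  res.send(renderProductPage(product)); // hypothetical renderer
});

app.listen(3000);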

Structured data

If you are using RDFa or Microdata for your structured data implementation and struggle with how unreadable or bulky it makes your markup (or find it simply much harder to maintain), JSON-LD is a wonderful alternative that is very lightweight and does not need to be mixed into your HTML.

Using vocabularies such as schema.org, JSON-LD allows linked data to be expressed as JSON in terms you might already be familiar with. And now, with the rendering capabilities supported by search engines, it can be added dynamically via JavaScript.

If you maintain a database that contains your product data and you’re dynamically adding that data to your pages via an API, it is easy to extend the model to also contain the linked structured data, which you can format as JSON-LD and attach to the head of your document.
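A sketch of that approach (the product fields are illustrative) might look like this:

function addProductStructuredData(product) {
  const script = document.createElement('script');
  script.type = 'application/ld+json';
  script.textContent = JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.description,
    offers: {
      '@type': 'Offer',
      price: product.price,
      priceCurrency: product.currency,
    },
  });
  document.head.appendChild(script); // attach the linked data to the head
}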

JavaScript best practices for links & navigation


Anchors are king when linking

When crawling links, search engine bots do not look for non-anchor elements that may redirect to another view when rendering, so relying on non-anchor elements for core navigation links should definitely be avoided if you want your site to rank well in search. Always use the anchor element with an href attribute for navigation links. It is fine to have click handlers on your anchors, but they should only be used to add enhanced functionality, such as showing a tooltip.

Hashbang links are also a big no-no when it comes to SEO. To be indexable, JavaScript navigation should use the History API, via .pushState() or .replaceState() as appropriate. Hash-based URLs are only suitable for navigation within a single web page, for example, to jump from a table of contents at the top of the page to a specific section’s text lower down.
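Putting both points together, here is a sketch of indexable client-side navigation (renderRoute is a hypothetical client-side router): real anchors with real hrefs for bots, with the History API layered on top for users:

document.addEventListener('click', (event) => {
  const link = event.target.closest('a[href]');
  if (!link || link.origin !== location.origin) return;

  event.preventDefault(); // intercept same-origin anchor clicks only
  history.pushState({}, '', link.href); // a real URL, no hashbang
  renderRoute(location.pathname); // hypothetical client-side router
});

// Keep the browser’s back and forward buttons working too.
window.addEventListener('popstate', () => renderRoute(location.pathname));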

It is best to check for any errors in API calls; if there are failures, add a meta noindex to the document and redirect to an error page, to avoid being ranked down for such issues.
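A sketch of that pattern (the endpoint and helpers are hypothetical):

async function loadCriticalContent() {
  try {
    const response = await fetch('/api/content'); // hypothetical endpoint
    if (!response.ok) throw new Error('HTTP ' + response.status);
    renderContent(await response.json()); // hypothetical renderer
  } catch (error) {
    // Tell search engines not to index this broken state...
    const meta = document.createElement('meta');
    meta.name = 'robots';
    meta.content = 'noindex';
    document.head.appendChild(meta);
    // ...then send the user to a proper error page.
    location.replace('/error');
  }
}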

JavaScript performance best practices


JavaScript rendering techniques

It is important to know that not all search engine bots can render JavaScript and, even if they do, they vary in how well they do it. So you may want to consider your options for the various rendering techniques to make sure your content is indexed correctly by search engines.

Prerendering

Prerendering involves generating static HTML pages before those pages are requested by the client. For websites that don’t update their content often, prerendering offers good page-load performance, since no computation is needed to generate the pages at request time.

Server-side rendering

Whilst opting for prerendering might be suitable for some websites, a large e-commerce site might struggle with it, as its inventory data might change daily. For frequently updated sites, server-side rendering (SSR) comes to the rescue. SSR allows static pages to be generated on the fly on the server, so any database changes are reflected immediately. Whilst you get a lot more dynamism using this technique, you are relying on your server’s capabilities for performance.

Dynamic rendering

Whichever of the above methods is right for your website, you may decide that you still want to serve your JavaScript-based pages when the site is accessed by users via a browser, but serve only the prerendered or SSR version of the page to search engine bots.

Detecting the requesting user agent and serving different content based on whether it is a search engine bot or a normal user is referred to as dynamic rendering. Today, dynamic rendering is considered a foolproof way to have dynamic, JavaScript-powered websites while still having your content easily indexed by search engine bots.
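As a rough sketch in Express (an assumption, as is the prerender helper), the server simply branches on the user-agent header:

const express = require('express');
const path = require('path');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider/i;

app.get('*', async (req, res) => {
  if (BOT_PATTERN.test(req.headers['user-agent'] || '')) {
    // Bots get a static, fully rendered snapshot of the page.
    res.send(await prerender(req.originalUrl)); // hypothetical prerenderer
  } else {
    // Normal users get the JavaScript app shell.
    res.sendFile(path.join(__dirname, 'dist', 'index.html'));
  }
});

app.listen(3000);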

Lazy loading images


Reducing page load time is increasingly a differentiating factor for both UX and SEO. Images are among the most common large assets used on websites, so it can be helpful to reduce page load time by deferring images that are not in the viewport when a user enters a page and loading them later on. This is referred to as ‘lazy loading’.

Generally, there are three approaches for lazy loading images:

  • Using the JavaScript resize, scroll, and orientationchange events
  • Via the Intersection Observer API
  • Browser-native lazy loading

JavaScript events for lazy loading

This approach is only really needed if you have to support older browsers, or browsers that have not yet implemented the Intersection Observer API.

This technique involves looping through the not-yet-loaded images whenever the resize, scroll, or orientationchange events fire, and checking their position using getBoundingClientRect().top and getBoundingClientRect().bottom to see whether they are in view, at which point you actually load the image. Once all the images have loaded, there is no need to keep listening, so the event handlers can be removed.
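A sketch of the technique, assuming each image holds its real URL in a data-src attribute until needed:

let lazyImages = Array.from(document.querySelectorAll('img[data-src]'));

function lazyLoad() {
  lazyImages = lazyImages.filter((img) => {
    const rect = img.getBoundingClientRect();
    const inView = rect.top < window.innerHeight && rect.bottom > 0;
    if (inView) img.src = img.dataset.src; // load the real image
    return !inView; // keep only the images still waiting
  });

  if (lazyImages.length === 0) {
    // Everything has loaded; no need to keep listening.
    ['resize', 'scroll', 'orientationchange'].forEach((name) =>
      window.removeEventListener(name, lazyLoad)
    );
  }
}

['resize', 'scroll', 'orientationchange'].forEach((name) =>
  window.addEventListener(name, lazyLoad)
);
lazyLoad(); // also check once on initial load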

A similar technique can be applied to CSS background images where the initial background is a lighter image or a background colour, but when the element is in the viewport, the actual background-image is set using a class.

Intersection Observer API for lazy loading

Most modern browsers now support this feature, which makes lazy loading much, much easier. When the DOM content has fully loaded, you create a list of all the lazy images using querySelectorAll. Then you create an IntersectionObserver instance and call its .observe() method for each of those images.

The observer callback receives two arguments: the observer entries and the observer itself. You can check whether any of the entries is in view using its .isIntersecting property. When the value is true, set the src attribute of the image, then call the .unobserve(target) method on the entry’s target to stop observing the image once it has loaded.
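A sketch of the whole flow, again assuming the real URL lives in a data-src attribute:

document.addEventListener('DOMContentLoaded', () => {
  const lazyImages = document.querySelectorAll('img[data-src]');

  const observer = new IntersectionObserver((entries, obs) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src; // swap in the real image
        obs.unobserve(img); // stop observing once it has loaded
      }
    });
  });

  lazyImages.forEach((img) => observer.observe(img));
});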

Native lazy loading

If you are lucky enough to only care about modern browsers, a very easy way to lazy load images is via the native loading attribute on images (loading="lazy"), which automatically defers loading images outside the critical rendering path until the user’s scroll position gets close to them.
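For example (the filename is illustrative):

<img src="product-photo.jpg" alt="Product photo" loading="lazy">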

Final thoughts on JavaScript SEO best practices


While JavaScript can introduce search engine optimization issues to your website, they are not impossible to overcome. These days, with the right JavaScript practices in place, it is possible to give users a dynamic, app-like website experience and still rank well in the search engines.

SEO is a constantly evolving discipline, however, so staying up to date on the latest search engine developments will be key when it comes to ranking your creative JavaScript sites so they can be found by more visitors.
