There’s also the option of dynamic rendering, which means serving rendered content to certain user-agents while everyone else gets the client-side version. It’s essentially a workaround, but it can be useful for bots like search engines and social media crawlers. Social media bots don’t run JavaScript, so things like OG tags won’t be seen unless you render the content before serving it to them.
If you were using the old AJAX crawling scheme, note that it has been deprecated and is no longer supported.
Making your JavaScript site SEO friendly
A lot of the processes are similar to things SEOs are already used to seeing, but there might be slight differences.
On-page SEO
All the normal on-page SEO rules for content, title tags, meta descriptions, alt attributes, meta robot tags, etc. still apply. See On-Page SEO: An Actionable Guide.
A couple of issues I repeatedly see when working with JavaScript websites are that titles and descriptions may be reused and that alt attributes on images are rarely set.
Allow crawling
Don’t block access to resources. Google needs to be able to access and download resources so that it can render pages properly. In your robots.txt, the easiest way to allow the needed resources to be crawled is to add:
User-Agent: Googlebot
Allow: .js
Allow: .css
URLs
Change URLs when updating content. I already mentioned the History API; JavaScript frameworks have a router that lets you map to clean URLs. You don’t want to use hashes (#) for routing, because anything after a # is typically ignored by servers. For a URL like abc.com/#something, the server only sees abc.com. This is especially a problem with Vue and some of the earlier versions of Angular. To fix this for Vue, you can work with your developer to change the following:
Vue router:
Use "history" mode instead of the default "hash" mode.
const router = new VueRouter({
  mode: 'history',
  routes: [] // the array of route records
})
Duplicate content
With JavaScript, there may be several URLs for the same content, which leads to duplicate content issues. This may be caused by capitalization, IDs, parameters with IDs, etc. So, all of these may exist:
domain.com/Abc
domain.com/abc
domain.com/123
domain.com/?id=123
The solution is simple. Choose one version you want indexed and set canonical tags.
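As an illustration, a client-side helper that normalizes every variant to one canonical form could look like the sketch below. The normalization rules here (lowercasing the path, stripping parameters) are assumptions; match them to whichever version you actually want indexed.

```javascript
// Normalize a URL to its canonical form: lowercase path, no query string.
// These rules are an example; pick rules matching your chosen canonical version.
function canonicalUrl(href) {
  const url = new URL(href);
  return url.origin + url.pathname.toLowerCase();
}

// Insert or update the canonical <link> in the document head (browser only).
function setCanonical(href) {
  let link = document.querySelector('link[rel="canonical"]');
  if (!link) {
    link = document.createElement('link');
    link.rel = 'canonical';
    document.head.appendChild(link);
  }
  link.href = canonicalUrl(href);
}
```

Note that a canonical tag included in the server-delivered HTML is more reliable than one injected with JavaScript, since it doesn’t depend on the page being rendered.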
SEO “plugin” type options
For JavaScript frameworks, these are usually referred to as modules. You’ll find versions for many of the popular frameworks like React, Vue, and Angular by searching for the framework + module name like “React Helmet.” Meta tags, Helmet, and Head are all popular modules with similar functionality allowing you to set many of the popular tags needed for SEO.
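Whichever module you pick, the underlying idea is the same: give every route its own title and description. A minimal vanilla-JavaScript sketch of that idea follows; the route table and helper names are hypothetical, and real modules handle this declaratively per component.

```javascript
// Hypothetical per-route meta table: every view gets a unique
// title/description, avoiding the reused-titles problem noted above.
const routeMeta = {
  '/': { title: 'Home | Example Store', description: 'Welcome to Example Store.' },
  '/abc': { title: 'ABC Widget | Example Store', description: 'Details about the ABC widget.' },
};

// Pure lookup with a sitewide fallback.
function metaFor(path) {
  return routeMeta[path] || { title: 'Example Store', description: 'Example Store catalog.' };
}

// Apply the meta to the document head on navigation (browser only).
function applyMeta({ title, description }) {
  document.title = title;
  let tag = document.querySelector('meta[name="description"]');
  if (!tag) {
    tag = document.createElement('meta');
    tag.name = 'description';
    document.head.appendChild(tag);
  }
  tag.setAttribute('content', description);
}
```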
Error pages
Because JavaScript frameworks aren’t server-side, they can’t really throw a server error like a 404. You have a couple of different options for error pages: