
How Do I Correctly Deal With 404 HTTP Errors When Using An SPA And No Server-side Computation?

I am currently using Vue.js on a website project of mine.

The server that returns this Vue.js SPA will not be capable of computation or checks, and will only serve the static resources required to run the SPA in the browser, where all computation will then take place.

For SEO purposes, I want to ensure error pages are handled correctly. Currently, every URL returns a 200 OK and serves the SPA, which can confuse search engines about pages that are actually supposed to be invalid. What would be the correct way of telling both users and search engines that a page is invalid if I cannot change the response the server provides?

For context, the SPA will get data from an API on another domain.

I have looked at other questions that are similar, but they normally recommend server-side checks, a dedicated 404-page redirect or a soft-404 page. Server-side checks and a dedicated 404 page will not be possible, and I have read that soft-404 pages are disliked by search engines.

Is there any proper way to go about this?

I have seen this post, but it is quite old and only suggests a no-index tag, which still makes the page valid in the eyes of search engines, just not indexable.


Answer

You can't return a 404 error without any server-side computation or rendering, because the fact that the resource/page wasn't found relies on logic that, in your case, only executes on the client side. Hence, your only options are the following:

  • If the resource wasn't found, redirect the user to a pre-defined 404 page (that returns the correct HTTP status)

  • Blacklist paths that are invalid inside your proxy, or whitelist those that are valid, and proxy to a 404 page on all other paths

  • Manually create a sitemap with your valid pages/routes
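The first option can be sketched as a client-side route check that sends unknown paths to a pre-defined 404 page. The route list and the `/404` path below are assumptions for illustration; in Vue Router 4 the same idea is usually expressed as a catch-all route (`path: '/:pathMatch(.*)*'`) that redirects to the 404 page:

```javascript
// Minimal sketch: decide on the client whether a requested path is valid.
// The route whitelist and the "/404" target are illustrative assumptions.
const validRoutes = ["/", "/about", "/contact"];

function resolveRoute(path) {
  // Exact-match check; a real router would also handle dynamic segments.
  if (validRoutes.includes(path)) {
    return { path, redirect: null };
  }
  // Unknown path: redirect to the pre-defined 404 page. Crawlers will only
  // treat it as a real error if the server answers that path with an actual
  // 404 status; otherwise it remains a soft 404.
  return { path: "/404", redirect: "/404" };
}
```

Note that this only moves the user to the error page; the HTTP status of the original request is still whatever the static host returned.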

None of these options scales well if your project grows or you have dynamic routes, but those are the limitations of client-side rendering. I don't think there's a perfect answer to your question: this is a well-known limitation of client-side rendering, and one of the main reasons why projects that care about SEO prefer server-side rendering.
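If you do control a reverse proxy in front of the static host, the whitelist option above could look roughly like this in nginx (paths, file names, and routes are assumptions, not a drop-in config):

```nginx
# Hypothetical sketch: whitelist valid SPA routes, real 404 everywhere else.
root /srv/spa;              # assumed document root for the built SPA
error_page 404 /404.html;   # assumed pre-built static error page

# Known routes serve the app shell with a 200.
location ~ ^/(|about|contact)$ {
    try_files /index.html =404;
}

# Static assets (JS/CSS bundles) are served as-is.
location /assets/ {
    try_files $uri =404;
}

# Any other path gets a genuine 404 status plus the error page body.
location / {
    return 404;
}
```

The key point is that invalid paths never reach the SPA at all, so crawlers see a true 404 status rather than a 200 with error content.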

From an SEO perspective: as long as you add all your valid pages to a sitemap, and don't link (anchor tag) to the invalid ones from any of your other pages, search engine crawlers are very unlikely to ever crawl or index those invalid pages.
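A minimal hand-written sitemap for the valid routes might look like this (the domain and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>
```

Reference it from `robots.txt` (e.g. `Sitemap: https://example.com/sitemap.xml`) so crawlers stick to the listed URLs.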

But if you really care about SEO and have a dynamic app with hundreds or thousands of dynamic links that cannot be added to a sitemap manually, I'd advise you to switch to a production framework like Nuxt.js or Next.js, because they offer what you're looking for, plus many other SEO features, out of the box.

source: stackoverflow.com