Preface

As we all know, Vue projects are single-page applications, which makes them hard to optimize for SEO. Today let's walk through the main SEO optimization solutions ~


Vue project SEO optimization

Vue single-page applications (SPAs) are not SEO friendly, but there are several solutions, listed below.

1. SSR (server-side rendering)

With server-side rendering, the HTML page is parsed and assembled on the server, and the browser receives the fully rendered page structure directly.

Server-side rendering has requirements on both the Vue version and the server: a Node.js environment is required.

Advantages: better SEO, since search engine crawlers see the fully rendered page directly.

Disadvantages: the server must run a Node.js environment, and retrofitting an existing codebase is costly. Nuxt.js helps, but it has plenty of pitfalls, so be ready to step on them.
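
To make the idea concrete, here is a minimal SSR sketch of my own (it assumes Vue 3 and Express, neither of which the article prescribes); in practice Nuxt.js wraps this kind of setup for you:

```js
// A minimal SSR sketch: the server renders the app to HTML on every request,
// so crawlers receive real content instead of an empty mount point.
const express = require('express')
const { createSSRApp } = require('vue')
const { renderToString } = require('vue/server-renderer')

const server = express()

server.get('/', async (req, res) => {
  // Create a fresh app instance per request so state is never shared
  const app = createSSRApp({
    data: () => ({ msg: 'Hello from the server' }),
    template: '<div>{{ msg }}</div>'
  })

  const appHtml = await renderToString(app)

  res.send(`<!DOCTYPE html>
<html>
  <head><title>SSR demo</title></head>
  <body><div id="app">${appHtml}</div></body>
</html>`)
})

server.listen(3000)
```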

2. Static generation (blogs, brochure-style sites)

Nuxt.js can generate a fully static build; the drawback is that dynamic routes such as /users/:id are ignored by default (see the config sketch after this list).

Advantages:

  • Everything is handled at build time, so you serve pure static files with very fast access;
  • Compared with SSR, there is no server load to worry about;
  • Static pages are far harder for hackers to attack, so security is higher.

Disadvantages:

  • If there are many dynamic routes with many parameters, it is not a good fit.
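
As a rough sketch of what static generation looks like with Nuxt 2 (the route list below is made up for the example), dynamic routes have to be listed explicitly in nuxt.config.js:

```js
// nuxt.config.js - a minimal static-generation sketch (assumes Nuxt 2.13+);
// running `nuxt generate` then writes plain HTML files into dist/
export default {
  target: 'static',            // full static mode
  generate: {
    // Dynamic routes such as /users/:id are skipped by default,
    // so they must be enumerated here (or built from an API response)
    routes: ['/users/1', '/users/2', '/users/3']
  }
}
```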

3. prerender-spa-plugin

If you only need SEO for a few pages (e.g. the home page /, /about, etc.), prerendering is a good option.

prerender-spa-plugin simply generates static HTML files for specific routes at build time (the rendering happens while the project is being packaged).

Advantages: prerendering is easy to set up and requires minimal code changes.

Disadvantages: only suitable when a handful of pages need SEO; with hundreds or thousands of pages it is not recommended (the build becomes very slow).
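
For reference, a minimal sketch of wiring prerender-spa-plugin into a Vue CLI project (the route list is just an example):

```js
// vue.config.js - prerender-spa-plugin sketch (assumes a Vue CLI project)
const path = require('path')
const PrerenderSPAPlugin = require('prerender-spa-plugin')

module.exports = {
  configureWebpack: {
    plugins: [
      new PrerenderSPAPlugin({
        // Directory containing the built SPA (Vue CLI outputs to dist/)
        staticDir: path.join(__dirname, 'dist'),
        // Only these routes get a static HTML snapshot at build time
        routes: ['/', '/about']
      })
    ]
  }
}
```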

4. Using PhantomJS to handle crawler requests

PhantomJS is a WebKit-based headless browser: no UI, just a browser.

Clicks, page turns and other human-like operations have to be scripted.

This solution is essentially a workaround. The idea is to use the Nginx configuration to check whether the User-Agent of an incoming request belongs to a crawler.

If it does, the crawler's request is forwarded to a Node server, where PhantomJS renders the full HTML and returns it to the crawler (a sketch follows).
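
As a rough sketch of the PhantomJS side (the file name and the one-second wait are my own assumptions), the Node server runs something like `phantomjs spider.js <url>` and streams the output back to the crawler:

```js
// spider.js - minimal PhantomJS script that dumps the rendered DOM
var page = require('webpage').create()
var system = require('system')
var url = system.args[1]

page.open(url, function (status) {
  if (status !== 'success') {
    phantom.exit(1)
  }
  // Give the SPA a moment to finish its asynchronous rendering,
  // then print the fully rendered HTML for the crawler
  setTimeout(function () {
    console.log(page.content)
    phantom.exit()
  }, 1000)
})
```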

Advantages:

  • No changes to the project code are needed; you keep developing the SPA as before, so the cost is much lower than building SSR;
  • For projects that have already been developed as SPAs, this is basically the only choice.

Disadvantages:

  • Deployment requires a Node server.

  • Crawler visits are slower than normal page visits, because the HTML is only returned to the crawler after all resources have finished loading. (This does not affect user access.)

  • If someone maliciously fakes the Baidu crawler and crawls in large volumes, it can overload the server.

    The fix is to check whether the visiting IP address really belongs to Baidu's official crawler (a sketch follows this list).
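
A minimal sketch of that IP check on the Node side, using a reverse DNS lookup (the *.baidu.com / *.baidu.jp hostname rule follows Baidu's published guidance, but treat it as an assumption to verify):

```js
// Verify a claimed Baiduspider by reverse DNS (minimal sketch)
const dns = require('dns').promises

async function isRealBaiduSpider(ip) {
  try {
    const hostnames = await dns.reverse(ip)
    // Genuine Baidu crawlers resolve to hosts under baidu.com or baidu.jp
    return hostnames.some(h => /\.baidu\.(com|jp)$/.test(h))
  } catch (err) {
    return false // no reverse record, treat as a fake crawler
  }
}
```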

Summary

  • If you are building a large website, such as an e-commerce mall => SSR server-side rendering

  • If it is just a regular company website, blog, etc. => prerendering / static generation / PhantomJS are all convenient

  • If the project has already been built as an SPA and the deployment environment supports a Node server => use PhantomJS for SEO optimization

Blog reference: SEO optimization scheme


Conclusion

Bad times make a good man.