
I have a web application built with Laravel 7 and Vue.js 2 that has a few pages (home, about, cities, and help) which need to be crawlable by search engines. I have tried both pre-rendering and server-side rendering, without success.

For pre-rendering, the only package I found is prerender-spa-plugin, which is very old and has not been updated in 5 years.
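For reference, a minimal prerender-spa-plugin setup in a Vue CLI project looks roughly like this (a sketch based on the package's documented usage; the route paths are assumed from the pages listed above):

```javascript
// vue.config.js — sketch; assumes a Vue CLI build outputting to dist/
const path = require('path')
const PrerenderSPAPlugin = require('prerender-spa-plugin')

module.exports = {
  configureWebpack: {
    plugins: [
      new PrerenderSPAPlugin({
        // directory the built app is served from
        staticDir: path.join(__dirname, 'dist'),
        // only the public, SEO-relevant routes get pre-rendered
        routes: ['/', '/about', '/cities', '/help']
      })
    ]
  }
}
```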

Server-side rendering is also difficult because my web app is already designed and about to go to production; it is hard to implement now, and implementing SSR just for 5 pages is not recommended either.

Any suggestions to make these 5 pages crawlable by search engines are appreciated.

Update: my application contains other pages and components which do not need to be SEO optimized, such as user account and profile pages.

  • @Radu Diță I see your comment in stackoverflow.com/questions/55100614/… Commented Oct 17, 2022 at 7:15
  • You could give Nuxt a try IMO. Commented Oct 17, 2022 at 7:22
  • @kissu I could have used Nuxt at the very beginning of my project; now it's too hard to implement. Commented Oct 17, 2022 at 7:24
  • Moving from Vue to Nuxt could take a matter of a few minutes if you only have 5 pages. At the end, it's mainly a wrapper and you don't need to use everything in it to still get the benefits of it. Commented Oct 17, 2022 at 7:26
  • @kissu Only 5 pages need to be crawled by search engines; the whole application has lots of pages and components, and there are 6 types of user access. Commented Oct 17, 2022 at 7:32

1 Answer


Overall, I recommend using Nuxt if you want something that can deliver a professional experience, on top of giving you all the flexibility you would want from an SSR/SSG tool.
Here is a more detailed answer regarding the tools currently available for achieving SEO-crawlable content: https://stackoverflow.com/a/69075962/8816585

You mentioned that you wish to keep some pages SPA-only; this is also feasible thanks to the generate.exclude key of the configuration: https://stackoverflow.com/a/66472634/8816585
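A minimal sketch of that configuration, assuming Nuxt 2 static generation; `/account` and `/profile` are placeholder route names standing in for your SPA-only pages:

```javascript
// nuxt.config.js — sketch; route names are placeholders
export default {
  target: 'static',
  generate: {
    // keep non-SEO pages (user account, profiles) out of pre-rendering
    exclude: [
      '/account',   // exact path
      /^\/profile/  // regex: every route starting with /profile
    ]
  }
}
```

Excluded routes fall back to regular SPA behavior, so they still work for logged-in users; they simply are not pre-rendered at build time.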


2 Comments

If we exclude some parts of our app from SSR, then Google Search Console would not detect those excluded pages as SEO-friendly, right? Google Search Console checks every part of our app for mobile responsiveness.
@Zia Google has actually been able to parse client-side JS for a few years already. If you want to limit what Google crawls, check this page, especially the no-index one. And overall, you probably have some auth in place I guess, so that should not be a problem and your actual data will stay private.
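If you want to be explicit about keeping a page out of the index, here is a sketch of a per-page robots meta tag using Nuxt's component-level `head` option (the page name is a placeholder):

```javascript
// pages/account.vue (script section) — sketch; 'account' is a placeholder page
export default {
  head: {
    meta: [
      // ask crawlers not to index this SPA-only page
      { hid: 'robots', name: 'robots', content: 'noindex' }
    ]
  }
}
```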

