
I have a lot of text stored in Elasticsearch. With Lucene, NLP, and WordNet filters the search is good, but not as good as Google's, because none of these methods use AI to understand questions or the meaning of the text. Even if Google is not using AI heavily yet, their search will keep getting much better than mine as time passes. I researched technologies such as BabelNet, which give you meaningful word relations with supertypes and subtypes, but then I thought:
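For context, the non-AI approach described above typically means configuring a synonym token filter on the index. Below is a minimal sketch (index name, filter name, and the synonyms path are illustrative assumptions) of Elasticsearch index settings that plug WordNet-format synonyms into the analysis chain:

```python
# A hedged sketch of Elasticsearch index settings using a synonym token
# filter in Lucene's WordNet prolog format. The filter/analyzer names and
# the synonyms_path are assumptions, not taken from the question.
index_settings = {
    "settings": {
        "analysis": {
            "filter": {
                "wordnet_synonyms": {
                    "type": "synonym",
                    "format": "wordnet",                  # WordNet prolog format
                    "synonyms_path": "analysis/wn_s.pl",  # path is an assumption
                }
            },
            "analyzer": {
                "text_with_synonyms": {
                    "tokenizer": "standard",
                    "filter": ["lowercase", "wordnet_synonyms"],
                }
            },
        }
    }
}

# With the official client this body would be sent at index creation, e.g.:
#   es.indices.create(index="documents", body=index_settings)
print(index_settings["settings"]["analysis"]["analyzer"]["text_with_synonyms"]["filter"])
```

This expands query and document terms through WordNet relations at analysis time, which is exactly the kind of dictionary-based matching the question contrasts with Google's approach.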

Is it feasible to make a static page with links to other pages (one per database entry) and tell Googlebot to parse them and then when a user searches through my web application I run a search query from Google for my URL and return their results?

What are the cons? (I can see the pros I think).

  • Google provides this very feature for your consumption. Commented May 28, 2015 at 3:20
  • Care to elaborate? I have not heard of sites using Google as a full-text search engine in their backend (I am not talking about just an input box). Google probably will not index millions of static pages for one site. Commented May 28, 2015 at 9:32
  • google.com/work/search/products/gss.html Commented May 28, 2015 at 10:07
  • Put it as an answer, because I didn't see that. Commented May 28, 2015 at 10:23
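The product linked in the comments (Google Site Search) has a programmatic counterpart in the Custom Search JSON API, which lets a backend run a Google query restricted to its own pages. A minimal sketch follows; the API key, engine ID, and site domain are placeholder assumptions, and this only builds the request URL rather than performing the network call:

```python
from urllib.parse import urlencode

API_KEY = "YOUR_API_KEY"        # placeholder credential (assumption)
ENGINE_ID = "YOUR_ENGINE_ID"    # custom search engine (cx) id (assumption)


def build_search_url(query, site="example.com"):
    """Build a Custom Search JSON API request URL restricted to one site."""
    params = {
        "key": API_KEY,
        "cx": ENGINE_ID,
        "q": query,
        "siteSearch": site,  # restrict results to the site's indexed pages
    }
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)


print(build_search_url("some user question"))
```

Fetching this URL (e.g. with `requests.get`) would return JSON results that the web application could reformat, which is essentially the workflow proposed in the question.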
