I don't think the idea of merging results would work: it would take about as much power per search as a standard search-engine setup, where you effectively have a lookup table of which words appear in which documents and intersect the document sets for each query word (sometimes with relevancy weightings and other such funky things). And the limitation of only being able to update 1,000 words a day would mean that any merged results would soon get very out of date.
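To make the lookup-table idea concrete, here's a minimal sketch of that inverted-index approach (in Python rather than Perl, purely for illustration; the document names and query words are made up): each word maps to the set of documents containing it, and a multi-word search intersects those sets.

```python
from functools import reduce

# Toy document collection (hypothetical content).
docs = {
    1: "perl search engine module",
    2: "search results merge",
    3: "perl module index",
}

# Build the lookup table: word -> set of document ids containing it.
index = {}
for doc_id, text in docs.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

def search(*words):
    """Return ids of documents containing every query word."""
    postings = [index.get(w, set()) for w in words]
    return reduce(set.intersection, postings) if postings else set()

print(sorted(search("perl", "module")))  # -> [1, 3]
```

A real engine would add the relevancy weightings mentioned above, but the core cost per search is exactly this set-merging work, which is why outsourcing it doesn't save much.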
Any saving you made by not needing to index the documents yourself would be lost to the added network traffic and latency of going through an external site.
The idea of using Google's API to offload the searches may be doable, but it would rely on there being no more than about 1,000 searches per day. And don't forget this needs to be scalable too, so fewer than 500 would be more realistic.
Is there a Perl module that provides a nice, efficient search engine for a site? I don't think I've seen one, but I'd be amazed if one doesn't exist.