Re: Database searches with persistent results via CGI

by joealba (Hermit)
on May 10, 2002 at 00:52 UTC


in reply to Database searches with persistent results via CGI

You've got some good ideas in your solution, but you may want to attack the weakest part of your search first -- the name search. DBIx::TextIndex might be a bit difficult to implement, but it looks quite powerful.

My only other suggestion (aside from the other wise monks' posts) is that you should store your saved search results all in one table with a key id, rather than in a separate table for each search. If keyed properly, it should be just about as fast to search -- and much easier to maintain and purge.
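For illustration, here's a minimal DBI sketch of that one-table design -- the table name, column names, and sample values are my own inventions, not from the original post:

    # Hypothetical schema -- one table keyed by search_id, rather than
    # one table per saved search:
    #
    #   CREATE TABLE saved_results (
    #       search_id  INTEGER NOT NULL,
    #       row_rank   INTEGER NOT NULL,
    #       record_id  INTEGER NOT NULL,
    #       PRIMARY KEY (search_id, row_rank)
    #   );

    use DBI;
    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                           { RaiseError => 1, AutoCommit => 1 });

    my $search_id    = 42;               # key assigned to this saved search
    my @matching_ids = (101, 102, 103);  # record ids found by the search

    # Save one result set under its key.
    my $sth = $dbh->prepare(
        'INSERT INTO saved_results (search_id, row_rank, record_id)
         VALUES (?, ?, ?)');
    my $rank = 0;
    $sth->execute($search_id, $rank++, $_) for @matching_ids;

    # Purging an old search is one DELETE instead of a DROP TABLE.
    $dbh->do('DELETE FROM saved_results WHERE search_id = ?', undef, 17);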

Replies are listed 'Best First'.
Re: Re: Database searches with persistent results via CGI
by Anonymous Monk on May 10, 2002 at 12:10 UTC
    1.5M records is not *that* much data. Indeed, you should attack the weakest part of the problem, which is the search itself.

    You probably need to put an index on the search columns, though. Yes, indexes help with LIKE searches too -- at least for anchored patterns like 'smith%' -- if the database is sophisticated enough. PostgreSQL is an example of such an RDBMS. You may need to tune other DB parameters to give the backend more resources to work with. If the DB is swapping out to virtual memory a lot, that makes for a HUGE performance hit. With an RDBMS the goal is to have the table(s) cached in RAM, not read from disk. Perhaps a few more sticks of memory and/or some configuration changes and a couple of well-placed indexes are all you need.
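    As a sketch of that, assuming PostgreSQL and a hypothetical people table with a last_name column: a plain b-tree index covers equality tests, but for LIKE prefix searches to use an index in a non-C locale, PostgreSQL wants the index built with the text_pattern_ops operator class:

        use DBI;
        my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                               { RaiseError => 1 });

        # Plain b-tree index: equality tests, sorting, range scans.
        $dbh->do('CREATE INDEX people_last_name_idx
                      ON people (last_name)');

        # In PostgreSQL, a prefix search like LIKE 'smith%' can only use
        # an index in a non-C locale if the index is built with the
        # text_pattern_ops operator class.
        $dbh->do('CREATE INDEX people_last_name_like_idx
                      ON people (last_name text_pattern_ops)');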

    You may want to look into the use of "OFFSET" and "LIMIT" clauses in your SQL statement. The backend still has to run the whole query, but you only get the rows you need back. If you are fetching the entire table in Perl just to output the last 25 rows, you are wasting a LOT of CPU; let the DB do that for you, because it's optimized to do so.
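    A minimal DBI sketch of that paging pattern (the people table, column names, and page size are assumptions for illustration):

        use DBI;
        my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                               { RaiseError => 1 });

        my $page_size = 25;   # rows shown per page
        my $page      = 2;    # zero-based page number from the CGI form

        # Let the backend do the paging; only $page_size rows come back.
        my $sth = $dbh->prepare(
            'SELECT record_id, last_name, first_name
               FROM people
              WHERE last_name LIKE ?
              ORDER BY last_name, first_name
              LIMIT ? OFFSET ?');
        $sth->execute('smith%', $page_size, $page * $page_size);

        while (my $row = $sth->fetchrow_hashref) {
            print "$row->{last_name}, $row->{first_name}\n";
        }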

    If by chance you are on PostgreSQL . . . have you done a "VACUUM ANALYZE"? It gathers statistics about the data in your tables, which the query optimizer uses to choose better query plans.
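    For what it's worth, that can be run through DBI too; VACUUM cannot run inside a transaction block, so AutoCommit must be on:

        use DBI;
        my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                               { RaiseError => 1, AutoCommit => 1 });

        # Refresh the planner's statistics for the (hypothetical)
        # people table; must run outside a transaction.
        $dbh->do('VACUUM ANALYZE people');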
