PerlMonks |
A couple of thoughts:
1. Have you profiled the code? Is the time in the execute or the iteration?
2. If it is in the execute, are the tables properly indexed?
3. You don't say which DB you are using; some have statistics-gathering agents that optimize queries. For example, Informix allows you to update statistics on tables and columns. I've seen this process reduce query time from heat-death-of-the-universe down to 3-5 seconds for a complex join or view.
4. Most commercial DBs have an explain-plan feature, which tells you how the query optimizer will try to quickly return data, and which can be tuned.
5. If it is the iteration that is taking the time, I would suggest using the Perl profiler to optimize the code in the loop.

Hope some of this helps,
-pete

"Pain heals. Chicks dig scars. Glory lasts forever."

In reply to Re: Big database queries
by dreadpiratepeter
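
The execute-vs-iteration question in point 1 can be answered without a full profiler run by timing the two phases separately with Time::HiRes. A minimal sketch, assuming DBI; the DSN, credentials, table, and SQL below are placeholders you would substitute with your own:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Time::HiRes qw(gettimeofday tv_interval);

# Placeholder connection details -- replace with your own DB and credentials.
my ($dsn, $user, $pass) = ('dbi:Informix:mydb', 'user', 'secret');
my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });

my $sth = $dbh->prepare('SELECT col1, col2 FROM big_table');

# Phase 1: time the execute (the query itself).
my $t0 = [gettimeofday];
$sth->execute;
my $exec_time = tv_interval($t0);

# Phase 2: time the fetch loop (the iteration).
$t0 = [gettimeofday];
while ( my $row = $sth->fetchrow_arrayref ) {
    # ... per-row work goes here ...
}
my $fetch_time = tv_interval($t0);

printf "execute: %.3fs  fetch loop: %.3fs\n", $exec_time, $fetch_time;
```

If the fetch loop dominates, running the whole script under a profiler such as Devel::NYTProf (`perl -d:NYTProf script.pl`, then `nytprofhtml`) will show which statements inside the loop are costing the time; if the execute dominates, look at indexes, statistics, and the explain plan as in points 2-4.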