Re^3: DBI speed up needed on MySQL

by graff (Chancellor)
on May 08, 2005 at 04:59 UTC


in reply to Re^2: DBI speed up needed on MySQL
in thread DBI speed up needed on MySQL

Well, then you'll probably want to get the list of column names via a separate query, store those in an array (e.g. "@colnames"), then use the array both to build the data query and to load the hash:
my $sqlstr = "select " . join(",", @colnames) . " from sometable ...";
my $sth = $dbh->prepare( $sqlstr );
$sth->execute;   # execute before fetching
while ( my $rowref = $sth->fetchrow_arrayref ) {
    @rowhash{@colnames} = @$rowref;
    ...
}
(untested, but it should give you a good-enough idea; note the update to use "@$rowref" in the hash slice assignment instead of "$$rowref", which was wrong.)

Since you're using MySQL, you could get the list of column names with a query like "describe sometable" -- the column name is the first column in each returned row. (There are other ways, but this is pretty simple.)
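
An untested sketch of building @colnames that way (assuming a connected $dbh and a table literally named "sometable"):

my @colnames;
my $desc = $dbh->prepare("describe sometable");
$desc->execute;
while ( my $row = $desc->fetchrow_arrayref ) {
    push @colnames, $row->[0];   # the field name is the first column of each DESCRIBE row
}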

Replies are listed 'Best First'.
Re^4: DBI speed up needed on MySQL
by shenme (Priest) on May 08, 2005 at 05:42 UTC
    See also the rather amazing "let's use hashes, but magically with bind" example in the DBI docs, in the "DBI STATEMENT HANDLE OBJECTS" section under "Statement Handle Methods", entry 'bind_columns'. It shows the (probably most) efficient way to populate a hash with the next record of data.
    $sth->execute;
    my %row;
    $sth->bind_columns( \( @row{ @{ $sth->{NAME_lc} } } ) );
    while ($sth->fetch) {
        print "$row{region}: $row{sales}\n";
    }

    Also, have you quantified the amount of data you need to paw through, in kilobytes or megabytes? Perhaps the sheer quantity of data you're moving (possibly over a network connection) is simply many megabytes?

Re^4: DBI speed up needed on MySQL
by bart (Canon) on May 08, 2005 at 09:42 UTC
    Well, then you'll probably want to get the list of column names via a separate query, store those in an array (e.g. "@colnames")...
    No you don't. Check out the NAME attribute of statement handles in the DBI docs. DBI fetches and stores the column names in an array when you execute the query, and this attribute holds a reference to that array.
    my $sth = $dbh->prepare("SELECT * FROM sometable");
    $sth->execute;                  # NAME is reliably available only after execute
    my $colnames = $sth->{NAME};
    while ( my $rowref = $sth->fetchrow_arrayref ) {
        @rowhash{@$colnames} = @$rowref;
        ...
    }

    However, DBI being highly optimized as it is, I don't expect any speed improvement over fetching the hashref.

    Why the need for a hash anyway? It's not ordered. Keeping the names and the data as two arrays holds more information than the hash alone does: the column order. For printing, that order usually matters. To walk the arrays side by side, loop over the array index (both arrays have the same size).
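
    As a minimal sketch of that side-by-side loop (assuming $colnames and $rowref as in the snippet above):

    for my $i ( 0 .. $#$colnames ) {
        print "$colnames->[$i]: $rowref->[$i]\n";   # name and value stay in column order
    }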

    You can always map the field names to column numbers (the inverse of the array @$colnames) in a separate hash, and you need to do that only once:

    my %colcolumn;
    @colcolumn{@$colnames} = 0 .. $#$colnames;
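
    A lookup with it might then read something like this (the column name "sales" is just an assumed example):

    my $sales = $rowref->[ $colcolumn{sales} ];   # pick a field out of the row by name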

    Still, I don't expect all this mucking about to work any miracles. If it did, Tim Bunce and co. would have had to be seriously messing about, and I don't think they were.

      However, DBI being highly optimized as it is, I don't expect any speed improvement over fetching the hashref.

      It is highly optimized, but the docs point out that using fetchrow_hashref is slower than using other methods.
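
      For reference, the convenient-but-slower call being compared is something like this (column names taken from the earlier example):

      while ( my $row = $sth->fetchrow_hashref ) {
          print "$row->{region}: $row->{sales}\n";   # a fresh hash ref is returned for every row
      }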

      Why the need for a hash anyway?

      That's the spec I have been given. I need a hash ref. (Another application uses it.)
