dbf_dump to apply to multiple files in a directory

by solanum (Initiate)
on Aug 03, 2014 at 10:18 UTC

solanum has asked for the wisdom of the Perl Monks concerning the following question:

Hello all, I am a novice to Perl programming and need some (a lot?) of assistance. The scenario is as follows: I have about 1000 DBF files that need to be converted to CSV files. I understand that I can use dbf_dump to accomplish this; however, I am having issues with the coding. I have installed the XBase component and have tested it on a couple of files, and that works like a charm. I need help with the coding when it comes to arrays. So here goes the code I do have *please be gentle with the remarks :)*

#!/usr/bin/perl -w

opendir(DIR, ".");
@files = grep(/\.dbf$/, readdir(DIR));
closedir(DIR);

foreach $file (@files) {
    dbf_dump --fs "," "$file" > $_.csv;
}

I get the following error when the script gets checked: "Bareword found where operator expected, near '--fs'". Help!

Replies are listed 'Best First'.
Re: dbf_dump to apply to multiple files in a directory
by AppleFritter (Vicar) on Aug 03, 2014 at 11:36 UTC

    Howdy solanum, welcome to the Monastery!

    You don't actually need Perl for this. The following would also work (if you're using bash or a similar shell; I don't have any experience with C shells):

    $ for i in *.dbf; do dbf_dump --fs "," "$i" >"$i.csv"; done

    If you want to use Perl, my anonymous brothers have already provided some advice. The main problem with your script is that you're trying to call an external command (dbf_dump) as you would from a shell script. But Perl isn't a shell, it's a programming language, so you have to use a function like system to run external commands.

    The error you're getting is related to that: "Bareword found where operator expected, near '--fs'" tells you that Perl is trying to parse your dbf_dump invocation as Perl code and failing to make sense of it.

      Unfortunately, this needed to be done in Perl. However, thank you, I will keep this in mind.

        You're welcome!

        Here's another tip: although you can of course use opendir and grep to get your list of files, Perl also has a built-in function called glob for filename expansions. Using that, the code that my anonymous brother posted in Re: dbf_dump to apply to multiple files in a directory simplifies to:

        #!/usr/bin/env perl
        use warnings;
        use strict;
        use IPC::System::Simple qw/system/;  # recommended but optional

        foreach my $file (glob "*.dbf") {
            system(qq{dbf_dump --fs "," "$file" > "$file.csv"}) == 0
                or die "system() failed (\$?=$?)";
        }

        I've also replaced $_ with $file here ($_ is not otherwise used and is never assigned any value), and put quotes around the CSV file's name to avoid problems with filenames that contain spaces.
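
        If you want to sidestep shell quoting entirely, IPC::System::Simple also exports capturex, which runs the command without a shell and returns its output so you can write the CSV file from Perl yourself. Here's a minimal sketch along those lines (assuming dbf_dump is on your PATH and accepts --fs as above):

        #!/usr/bin/env perl
        use warnings;
        use strict;
        use IPC::System::Simple qw/capturex/;  # dies with a useful message on failure

        foreach my $file (glob "*.dbf") {
            # capturex passes each argument directly to the command (no shell),
            # so spaces or shell metacharacters in $file need no extra quoting
            my $csv = capturex('dbf_dump', '--fs', ',', $file);
            open my $out, '>', "$file.csv" or die "Can't write to $file.csv: $!";
            print {$out} $csv;
            close $out or die "Couldn't close $file.csv: $!";
        }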

Re: dbf_dump to apply to multiple files in a directory
by stefbv (Curate) on Aug 03, 2014 at 13:44 UTC

    In the spirit of TIMTOWTDI, there is another way: you can use the 'dump_records' sub from the 'XBase' module directly.

    Unfortunately there is no parameter in that sub for the output file, so we have to use 'select' to redirect the output to our CSV file.

    use strict;
    use warnings;
    use XBase;

    my $dbf_file = shift;
    die "Usage dbf2csv <dbf-file>" unless -f $dbf_file;

    dbf_dump($dbf_file, fs => ',');    # defaults: rs => "\n", fs => ':', undef => ''

    sub dbf_dump {
        my ($dbf, %opts) = @_;
        my $csv = "$dbf.csv";
        open my $fh, '>', $csv or die "Can't write to file ", $csv, ": $!";
        select $fh;
        dump_records($dbf, %opts);
        select STDOUT;
        close $fh;
    }

    sub dump_records {
        my ($name, %opts) = @_;
        my $table = XBase->new($name) or die XBase->errstr;
        $table->dump_records(%opts);
        $table->close;
    }

    Now you can add the two subs to your script and call dbf_dump like in the example.
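
    For instance, to cover all 1000 files from the original question, a small loop could drive that sub over the whole directory, replacing the single-file shift/die lines above. A quick sketch (assuming the two subs are already in the script):

    # process every .dbf file in the current directory
    for my $dbf (glob '*.dbf') {
        dbf_dump($dbf, fs => ',');    # writes "$dbf.csv" next to the source file
    }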

    (The script from DBD::XBase is named 'dbfdump', without the underscore).

      I will also try this out, but it will probably take me a little longer to go through the logic. Thank you for all your help!

Re: dbf_dump to apply to multiple files in a directory
by Anonymous Monk on Aug 03, 2014 at 10:56 UTC

    You should use warnings; use strict; at the top of all your scripts, and you should check the return values of functions for errors (in this case opendir). Here's a version incorporating that and a bit of cleanup:

    #!/usr/bin/env perl
    use warnings;
    use strict;
    use IPC::System::Simple qw/system/;  # recommended but optional

    opendir my $dh, "." or die "opendir failed: $!";
    my @files = grep {/\.dbf$/} readdir $dh;
    closedir $dh;

    foreach my $file (@files) {
        system(qq{dbf_dump --fs "," "$file" > $_.csv}) == 0
            or die "system() failed (\$?=$?)";
    }

    Note the error handling code on system isn't really needed if you use IPC::System::Simple.
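
    In other words, with IPC::System::Simple's drop-in system the loop body could shrink to something like this (just a sketch; its system throws an exception with a useful message whenever the command can't be started or exits non-zero):

    use IPC::System::Simple qw/system/;  # replaces the built-in system

    foreach my $file (@files) {
        # no "== 0 or die ..." needed: this system() dies on any failure
        system(qq{dbf_dump --fs "," "$file" > "$file.csv"});
    }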

      Thank you very much for your help! This worked like a charm!

Re: dbf_dump to apply to multiple files in a directory
by Anonymous Monk on Aug 03, 2014 at 10:45 UTC
    system("dbf_dump --fs \",\" \"$file\" > $_.csv")==0 or die "system() failed (\$?=$?)";

    Even better with IPC::System::Simple, because it has very nice error handling: use IPC::System::Simple qw/system/;
