
loading a txt file into SQL Server

by SamueD2 (Novice)
on Jan 27, 2003 at 19:42 UTC ( [id://230326] : perlquestion )

SamueD2 has asked for the wisdom of the Perl Monks concerning the following question:

I am new to Perl. What I need to do is read a tab-delimited text file and insert it into a table on SQL Server. The file has about 50 fields. Please help.

Replies are listed 'Best First'.
Re: loading a txt file into SQL Server
by Mr. Muskrat (Canon) on Jan 27, 2003 at 20:07 UTC

    This isn't really a Perl question. You should read the docs for your SQL server. Different SQL servers have different commands.

    SQL Server's "BULK INSERT" or MySQL's "LOAD DATA", for example.
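    A minimal sketch of that first suggestion, issued from Perl through DBI/DBD::ODBC. The DSN name, table name, and file path here are made-up placeholders; note that the file path must be visible to the SQL Server machine, not just the client.

    ```perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical DSN -- substitute your own ODBC data source.
    my $dbh = DBI->connect( "dbi:ODBC:MyDSN", "", "", { RaiseError => 1 } )
        or die "Unable to connect: $DBI::errstr\n";

    # Let SQL Server parse the tab-delimited file itself in one statement,
    # instead of sending one INSERT per row from the client.
    $dbh->do(q{
        BULK INSERT MyTable
        FROM 'C:\data\import.txt'
        WITH ( FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n' )
    });

    $dbh->disconnect;
    ```

    One round trip for the whole file is why this approach scales so much better than row-at-a-time inserts.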

Re: loading a txt file into SQL Server
by thatguy (Parson) on Jan 27, 2003 at 19:51 UTC
    Of course it would be easier to help you if there were some sample text. In the meantime, why not check out some of the Tutorials like split and join?
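    Since those tutorials are the heart of this reply, here is a tiny sketch of what split does to one tab-delimited line (the record contents are made up):

    ```perl
    use strict;
    use warnings;

    my $line = "Smith\tJohn\t1972-03-14";   # one made-up tab-delimited record
    chomp $line;                            # harmless here; needed on real file input
    my @fields = split /\t/, $line;         # -> ("Smith", "John", "1972-03-14")
    print scalar(@fields), " fields, first is $fields[0]\n";
    ```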
Re: loading a txt file into SQL Server
by OM_Zen (Scribe) on Jan 27, 2003 at 20:12 UTC

    open( my $fh, '<', 'read.txt' ) or die "Cannot open read.txt: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        my @columns = split /\t/, $line;
        # ... prepare the INSERT statement and execute it with @columns,
        # with error checking, then close the database handle.
        # To learn more about the module you are using for SQL Server
        # connectivity, run: perldoc -m TheModuleName
        # If you have not installed one yet, install a module from CPAN
        # (there are many), then connect, prepare, execute, and finish
        # the inserts into the database schema.
    }
    close $fh;

    PLEASE send the program you have written, so that it will be easier to give suggestions.
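    The prepare-and-execute step sketched above usually looks like this with DBI placeholders; the DSN and table name are assumptions for illustration, and the code is untested without a live server:

    ```perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical DSN and table.
    my $dbh = DBI->connect( "dbi:ODBC:MyDSN", "", "", { RaiseError => 1 } );

    # Prepare once, outside the loop: 50 placeholders for a 50-field file.
    my $placeholders = join ',', ('?') x 50;
    my $sth = $dbh->prepare("INSERT INTO MyTable VALUES ($placeholders)");

    open my $fh, '<', 'read.txt' or die "Cannot open read.txt: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        my @columns = split /\t/, $line, -1;  # -1 keeps trailing empty fields
        $sth->execute(@columns);              # DBI quotes each value for you
    }
    close $fh;
    $dbh->disconnect;
    ```

    Placeholders also sidestep the quoting problems you would hit building the SQL string by hand.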
Re: loading a txt file into SQL Server
by mlong (Sexton) on Jan 27, 2003 at 23:35 UTC

    I don't believe this is the proper place to get this information, but I'm not sure I know the full extent of the problem. Is the import you are trying to do something you want to automate, or something you will do infrequently and just need a tool for? If you just need a tool for the import, SQL Server Enterprise Manager provides everything you need.

    In that case open Enterprise Manager. Expand the "Microsoft SQL Server Nodes" branch and keep expanding until you see what looks like a list of folders. Then expand the "Databases" folder. Right-click on the database into which you would like to import the data (or create a new database first) and select "All Tasks | Import Data..."

    At this point, you'll be prompted with a wizard that will walk you through the import process. Click "Next>>". On the next screen, you'll see a drop down list labeled "Data Sources". In that drop down, select "Text File", which is the second from the bottom in the list on my system. From there you should be able to figure out the rest.

    Now, of course, if you want to do this regularly with a script to automate something, you will need to install Perl on Windows (I use ActiveState). Then from a command line run ppm. This is similar to the CPAN shell, which allows you to install modules on the fly. You can, at the prompt, type: install DBI. Once that has finished, type: install DBD-ODBC. Once these two modules are installed, you should be able to connect to a DSN through ODBC.

    The perl code to do what you want would look something like:

    #!C:/perl/bin/perl
    use strict;
    use DBI;

    my $filename = shift || die "Please provide a filename.\n";

    my $dbh = DBI->connect( "dbi:ODBC:DSNNAME", "", "",
        { RaiseError => 1, PrintError => 1, AutoCommit => 1 } )
        or die "Unable to connect: " . $DBI::errstr . "\n";

    # Open your file
    open( INFILE, $filename ) or die "Couldn't open file for reading. $!\n";

    while ( my $line = <INFILE> ) {
        chomp $line;
        my @cols = split( /\t/, $line );

        # build insert string with items in @cols
        my $sql = "INSERT INTO...";

        $dbh->do($sql) or die "Couldn't insert record. " . $dbh->errstr . "\n";
    }

    close(INFILE);
    $dbh->disconnect;

    This code is untested!!

    This method is not ideal for large amounts of data, because you are doing one insert per record, which means lots of overhead. In that case I would use the method I mentioned first, Enterprise Manager. There may even be a way to automate that.
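    One middle ground for the overhead concern (same caveats as the code above: untested, made-up DSN, table, and columns) is to switch off AutoCommit and commit in batches, so each row is not its own transaction:

    ```perl
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect( "dbi:ODBC:MyDSN", "", "",
        { RaiseError => 1, AutoCommit => 0 } );  # batch commits, not one per row

    # Hypothetical table and columns.
    my $sth = $dbh->prepare("INSERT INTO MyTable (a, b) VALUES (?, ?)");

    open my $fh, '<', 'import.txt' or die "Cannot open import.txt: $!";
    my $count = 0;
    while ( my $line = <$fh> ) {
        chomp $line;
        $sth->execute( split /\t/, $line );
        $dbh->commit if ++$count % 1000 == 0;    # commit every 1000 rows
    }
    $dbh->commit;   # flush the final partial batch
    close $fh;
    $dbh->disconnect;
    ```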

    Good luck.