http://qs321.pair.com?node_id=200006

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I am opening a file that is newline-delimited and doing a while loop over the filehandle. I am trying to append each line to a variable until I've passed over 20 lines. Below is my code.
open(FILE, $listfile);
while(my $user = <FILE>) {
    $user =~ s/(\r|\n)//;   # windows compatibility
    $userout .= "$user,";
    $lcount++;
    if ($lcount == 20) {
        print PIPE "$userout";
        $lcount = 0;
    }
    next if !$user;
}
Any help is appreciated.

Replies are listed 'Best First'.
Re: while loop over filehandle
by BrowserUk (Patriarch) on Sep 23, 2002 at 05:28 UTC

    Try this and see how you get on. Note the comments.

    #!perl                      # Adjust for your system
    use warnings;
    use strict;                 # You did just forget these 2 lines?

    my $lcount  = 0;            # Of course you'll need to declare your variables with my
    my $userout = '';
    my $listfile = "your filename";

    # Always check that the open worked.
    open(FILE, $listfile) or die "Couldn't open $listfile: $!\n";

    while(my $user = <FILE>) {
    #   $user =~ s/(\r|\n)//;   # windows compatibility
        chomp $user;            # Does what you were trying to do above but works on Macs as well

        $userout .= $user . ',';  # Does the same.
        $lcount++;
        if ($lcount == 20) {
            print PIPE $userout;  # no need to interpolate vars to print them
            # I assume this means you want to print them ALL to PIPE 20 at a time
            $lcount = 0;
            # In which case you need to clear $userout
            $userout = '';
            # If you only want the first 20 uncomment the next line
            # last;
        }
    #   next if !$user;         # No idea what you thought this did?
    }
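    Note that both the original snippet and the one above assume a PIPE filehandle that was opened elsewhere. A minimal sketch of opening one, where 'consumer' is a hypothetical placeholder for whatever command reads the lists:

    open(PIPE, "| consumer") or die "Couldn't start consumer: $!\n";
    # ... the while loop above goes here ...
    close(PIPE) or warn "consumer did not exit cleanly\n";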

    Cor! Like yer ring! ... HALO dammit! ... 'Ave it yer way! Hal-lo, Mister la-de-da. ... Like yer ring!
Re: while loop over filehandle
by spartacus9 (Beadle) on Sep 23, 2002 at 05:09 UTC
    Instead of using the regex to remove the \r and \n, try:
    chomp($user);
    That will remove the trailing newline from the string if one is present.

    I'm not sure what your "next if !$user" line is doing. Since you're already inside a while loop, that logic should already be taken care of without having to check the $user variable again.

      Just an additional note on that: while your substitution is best replaced by chomp($user) outright, you can still improve the regex itself by using a character class instead of alternation:

      You wrote $user =~ s/(\r|\n)//.

      You *meant* $user =~ s/(\r|\n)$// since that newline should only occur at the end of the string.

      That group is capturing, but it doesn't need to be: $user =~ s/(?:\r|\n)$//. Don't use capturing where it's not needed.

      In fact, grouping is wasteful for single-character alternation. Use a character class instead: $user =~ s/[\r\n]$//.

      And even better, use chomp(): chomp($user).
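      (For illustration, the character-class form and chomp behave the same on an ordinary newline-terminated line; the variable names below are made up:)

      my $user = "jdoe\n";
      (my $via_regex = $user) =~ s/[\r\n]$//;   # $via_regex is "jdoe"
      chomp(my $via_chomp = $user);             # $via_chomp is "jdoe"; chomp also returns the number of characters removed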

Re: while loop over filehandle
by Aristotle (Chancellor) on Sep 23, 2002 at 07:50 UTC
    As far as I can tell, your code should work, but it's not very Perlish. You can easily lose the loop control variable there and clarify the happenings. chomp can remove the end-of-lines on an entire list at once. join is ideal for concatenating a list of strings. In all, I'd write this like so:
    use constant PER_LINE => 20;

    my @out;
    do {
        @out = ();
        while(<FILE>) {
            push @out, $_;
            last if not 1 .. PER_LINE;
        }
        chomp @out;
        print PIPE join ",", @out;
        print PIPE "\n" if @out;
    } until @out != PER_LINE;
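    (In case the "last if not 1 .. PER_LINE" line looks odd: in scalar context .. is Perl's flip-flop operator, and a constant operand is compared against $., the current input line number. A tiny illustration, with a hypothetical filehandle FH:)

    while (<FH>) {
        print if 1 .. 3;   # true for input lines 1 through 3, false afterwards
    }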

    See Mark-Jason Dominus' excellent Program Repair Shop and Red Flags article series on Perl.com for many introductory pointers on writing good code.

    Update: D'oh! Fixed a tiny but nasty thinko in the code.

    Makeshifts last the longest.

Re: while loop over filehandle
by kabel (Chaplain) on Sep 23, 2002 at 08:42 UTC
    You do not need to count lines explicitly; the special variable $. holds the current input line number:
    use constant LINE_COUNT => 20;   # how many lines to read

    open (FH, $file) or die "$file: $!";
    # need to do it once to initialize $.
    push @content, scalar (<FH>);
    while ($. < LINE_COUNT and not eof FH) {   # eof guard: $. stops changing once the file is exhausted
        push @content, scalar (<FH>);
    }
    close (FH);
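    (An equivalent, perhaps more familiar shape is an ordinary while loop that stops once $. reaches the limit; a sketch assuming the same LINE_COUNT of 20:)

    use constant LINE_COUNT => 20;

    open (FH, $file) or die "$file: $!";
    my @content;
    while (my $line = <FH>) {
        push @content, $line;
        last if $. == LINE_COUNT;   # $. is the line number of the most recent read
    }
    close (FH);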
Re: while loop over filehandle
by Flexx (Pilgrim) on Sep 23, 2002 at 14:08 UTC

    Hi monks!

    First of all, do not use \n\r to cope with or match foreign systems' newlines! chomp may look like the better fix, but it is not a complete one: it only works for files whose line endings match the current system's (that is, a somewhat compatible format).

    Please, please read this node for an explanation of this.

    Another trick: when iterating over something of arbitrary size while emitting output (or doing whatever) for each chunk of a given size, use the modulo operator (%). It evaluates to zero every time the running count hits a multiple of the chunk size and is nonzero in between. Here's my try at this:

    # tested, but might not exactly match what anonymous
    # was doing (added newline output after each chunk,
    # blank after comma separator, ...)
    use constant CHUNK_SIZE => 20;

    open(FILE, $filename) or die "Can't open $filename: $!\n";
    while(<FILE>) {
        s/\012(?:\015)?|\015(?:\012)?//g;
        push @users, $_;
        unless($. % CHUNK_SIZE) {
            print join(', ', @users), "\n";
            @users = ();
        }
    }
    print join(', ', @users);   # Output what's left.
    close FILE;

    Of course one could also say unless(@users == CHUNK_SIZE) in this case, since scalar @users effectively acts as a sub-counter here. But, oh well, I just wanted to demonstrate a neat trick: when you want to interrupt a loop every few passes and all you keep is a running total, a counter modulo the chunk size really comes in handy. ;) When you think "do x on every nth line/iteration/pass", think modulo.
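    (A minimal, self-contained illustration of that "every nth pass" pattern, with made-up numbers:)

    for my $pass (1 .. 10) {
        print "pass $pass\n";
        print "--- checkpoint ---\n" unless $pass % 3;   # fires on passes 3, 6 and 9
    }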

    So long,
    Flexx

    Updated: Added last paragraph after first submitting. Made braces in s/// noncapturing (++diotalevi).