Directory Recursion Disorder

by Anonymous Monk
on Jun 23, 2003 at 07:24 UTC ( [id://268068]=perlquestion )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I run the following code where $directory contains a directory with subdirs:

    opendir DIR, $directory;
    for my $dir (readdir DIR) {
        if (-d $dir) {
            print "Found directory: ", $dir, "\n";
            push @directories, $dir;
        }
    }
    closedir DIR;

But it doesn't print out any of the subdir names, any ideas why?

Inside each of these directories there will be other dirs, each containing a single text file (so the format is /dir/dir/file). I'd like to create a HoA for storing the structure - what is the best way to go about doing this? I'd like to avoid overkill (no File::Find) since this will only have 2 levels of directories - suggestions? Thanks.

Replies are listed 'Best First'.
Re: Directory Recursion Disorder
by pzbagel (Chaplain) on Jun 23, 2003 at 07:30 UTC

    You need to prepend $directory when you do your filetest:

    if (-d "$directory/$dir")

    HTH
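
    For completeness, a sketch of the original loop with that fix applied (plus a lexical directory handle, an error check, and a skip for the "." and ".." entries that readdir also returns):

    ```perl
    my @directories;
    opendir my $dh, $directory or die "can't open $directory: $!";
    for my $dir (readdir $dh) {
        next if $dir eq '.' or $dir eq '..';
        # Test the full path, not the bare name that readdir returns
        if (-d "$directory/$dir") {
            print "Found directory: $dir\n";
            push @directories, $dir;
        }
    }
    closedir $dh;
    ```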

      yep, that's it. If I only had a dime for every time I made that mistake... I'd have about 40 cents ;-). Thanks.

Re: Directory Recursion Disorder
by barbie (Deacon) on Jun 23, 2003 at 08:46 UTC
    You want to avoid overkill, but without File::Find. Does that include File::Find::Rule too? If not, take a look at it, as it does what you want ... all in one line (okay, 2 if you include the use ;) ). From the synopsis:

    use File::Find::Rule;

    # find all the subdirectories of a given directory
    my @subdirs = File::Find::Rule->directory->in( $directory );

    Then to print use something like:

    print join("\n",@subdirs);
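
    Since the tree is only two levels deep, the same module can also build the HoA directly; a sketch, assuming $directory holds the base path and using the core File::Basename module to strip leading paths:

    ```perl
    use File::Find::Rule;
    use File::Basename;

    my %hoa;
    # first-level subdirectories only
    for my $dir ( File::Find::Rule->directory->mindepth(1)->maxdepth(1)->in($directory) ) {
        # second-level subdirectories under each of them
        my @subdirs = File::Find::Rule->directory->mindepth(1)->maxdepth(1)->in($dir);
        $hoa{ basename($dir) } = [ map { basename($_) } @subdirs ];
    }
    ```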

    --
    Barbie | Birmingham Perl Mongers | http://birmingham.pm.org/

      Thanks for the reply. Unfortunately File::Find::Rule depends on the following:

      use File::Spec;
      use Text::Glob 'glob_to_regex';
      use Number::Compare;
      use Carp qw/croak/;
      use File::Find (); # we're only wrapping for now
      use Cwd;           # 5.00503's File::Find goes screwy with max_depth == 0

      I try to avoid such overkill when I'm just recursing down 2 directories and throwing the structure in a HoA. Depending on 5 extra modules for a seemingly simple task gives me nightmares. Thanks for the suggestion though, I'll make sure to check it in a bit more depth when I overcome my paranoia ;-).

        I try to avoid such overkill when I'm just recursing down 2 directories and throwing the structure in a HoA.

        You keep saying overkill. You said it about File::Find itself, and you say it here. I think it's a bad argument. File::Find uses Carp, Exporter and Cwd either directly or indirectly. Exporter gets included with most modules; Cwd is pretty common and still isn't that big; Carp, like Exporter, is almost always being used anyway (just saying use warnings; brings it into existence). As for File::Find itself, it isn't that large either. So I think your overkill argument is bogus. Especially when it allows you to write:

        use File::Find;
        my @files;
        find { no_chdir => 1, wanted => sub { -d and return; push @files, $_ } }, @roots;

        Putting the files into the desired data structure is left as an exercise for you. I might just say that I suspect File::Spec will come in useful.
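
        One way the exercise might go, as a sketch: collect each file's path relative to the root being walked (File::Find sets $File::Find::topdir to that root), then split it with File::Spec to recover the two directory levels. This assumes @roots holds the base directories and that files really sit at /dir/dir/file:

        ```perl
        use File::Find;
        use File::Spec;

        my %hoa;
        find { no_chdir => 1, wanted => sub {
            return unless -f;
            # path relative to the root currently being traversed
            my $rel = File::Spec->abs2rel( $_, $File::Find::topdir );
            my ( $dir, $subdir ) = ( File::Spec->splitdir($rel) )[ 0, 1 ];
            push @{ $hoa{$dir} }, $subdir if defined $subdir;
        } }, @roots;
        ```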

        The point here is that in trying to avoid overkill that doesn't exist, you have wasted a bunch of time trying to partially reinvent a wheel. You seem to be more concerned with the startup time of your script (which is mostly determined by the OS and physical resource bottlenecks) than with the time it takes you to write the script. I bet you would have to run your program literally thousands of times before you made up the time you personally have lost.

        This comes down to premature optimisation. You are trying to make a process faster when you have absolutely no evidence that the process is suffering performance problems. (You can't have this evidence, as you haven't written the script yet, have you?) So you have committed two golden gaffes: the first is to optimise prematurely, and the second is to badly reinvent a wheel in order to do so. This is hardly an efficient use of your time.


        ---
        demerphq

        <Elian> And I do take a kind of perverse pleasure in having an OO assembly language...
Re: Directory Recursion Disorder
by PodMaster (Abbot) on Jun 23, 2003 at 07:32 UTC
    Does it print "Found directory: " at all? Is @directories empty? You should definitely add or die "can't open $directory : $!"; to your opendir call.

    update: You may wish to read `perldoc -f chdir'.

    MJD says "you can't just make shit up and expect the computer to know what you mean, retardo!"
    I run a Win32 PPM repository for perl 5.6.x and 5.8.x -- I take requests (README).
    ** The third rule of perl club is a statement of fact: pod is sexy.

Re: Directory Recursion Disorder
by Beatnik (Parson) on Jun 23, 2003 at 09:17 UTC
Re: Directory Recursion Disorder
by tachyon (Chancellor) on Jun 23, 2003 at 10:14 UTC
    my $root = 'c:';
    my @dirs = grep { -d } glob( "$root/*" );
    print "$_\n" for @dirs;
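
    Since the tree is only two levels deep, the same glob approach can be extended to fill the HoA directly; a sketch, with a hypothetical base path and the core File::Basename module to strip leading directories:

    ```perl
    use File::Basename;

    my $root = '/basedir';    # hypothetical base path
    my %hoa;
    for my $dir ( grep { -d } glob("$root/*") ) {
        # second-level directories under each first-level one
        my @subdirs = grep { -d } glob("$dir/*");
        $hoa{ basename($dir) } = [ map { basename($_) } @subdirs ];
    }
    ```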

    cheers

    tachyon

    s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print

Re: Directory Recursion Disorder
by crouchingpenguin (Priest) on Jun 23, 2003 at 11:30 UTC

    See Searching file extensions as well.


    cp
    ----
    "Never be afraid to try something new. Remember, amateurs built the ark. Professionals built the Titanic."
Re: Directory Recursion Disorder
by BrowserUk (Patriarch) on Jun 23, 2003 at 16:10 UTC

    Given that your directory structure is only 2 levels deep, there is no need for recursion. You can build your Hash of Arrays in one line.

    my %dirs = map{ $_ => [ map{ m[.*/(.*?)$] } glob "$_/*" ] } grep{ -d } glob "/basedir/*";

    The main advantage is that you build the datastructure you want directly, but I'm also not much of a fan of the Find::File* stuff.
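
    Unrolled for readability, the one-liner reads roughly as follows (a sketch; note that the regex match in list context returns the captured text after the final slash, i.e. the bare subdirectory name):

    ```perl
    my %dirs;
    # outer glob/grep: first-level directories under /basedir
    for my $dir ( grep { -d } glob "/basedir/*" ) {
        # inner glob/map: entries under $dir, reduced to their bare names
        my @names = map { m[.*/(.*?)$] } glob "$dir/*";
        $dirs{$dir} = [ @names ];    # key is the first-level path
    }
    ```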


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller


      Thanks. This looks like what I was looking for. I'll just make sure I understand it:

      Pseudo-code:

      use glob to expand filenames in /basedir
      use grep to grab all directory names
      for each directory name
          grab its subdirs
          pass them to map, where they're matched against the regex '.*/(.*?)$'
          assign the (array) result from the map to the corresponding %dirs key

      I think I've got the basic idea of it, could someone explain how the subdir keys are assigned in the hash in a bit more detail? Thanks :)

Node Type: perlquestion [id://268068]
Approved by zakb