quick CD burner Selector/Optimiser

by dominix (Deacon)
on Dec 14, 2003 at 23:07 UTC ( [id://314706] )

Here is a script that helps to burn plenty of data you want to keep together in respective directories, like MP3s :-)
Basically it just arranges directories by size to best fit on CDs, trying to fill the space left on each CD with smaller directories. I've saved 8 CDs out of 72 using this tip.
#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;

my ($k, $nom);
my $maxsize  = 690_000;      # capacity in KB (roughly a 700M CD-R)
my $makeiso  = 0;
my $pattern  = "*";
my @CD       = ([0, []]);    # each CD is [ used_kb, [ directories... ] ]
my $i;
my $inserted = 0;

for (@ARGV) {
    $maxsize = $1 if /-ma*x*s*i*z*e*=(\d+)/;
    $makeiso = 1  if /-mkisofs/;
    $pattern = $1 if /-pattern=(.*)/;
    die "usage $0 -max=xxx (maxsize in K) -mkisofs -pattern='*.gz' -help"
        if /-he*l*p*/;
}

# list the directories, largest first
my $pid = open(SOURCE, "du -sk $pattern|sort -nr|")
    or die "Couldn't get SOURCE: $!\n";

while (<SOURCE>) {
    chomp;
    ($k, $nom) = split /\t/, $_, 2;
    $inserted = 0;
    if ($k > $maxsize) {
        print '"' . $nom . '"' . " out of range: $k\n";
        next;
    }
    # first fit: put the directory on the first CD with enough room left
    foreach $i (0 .. $#CD) {
        my $rcontent = $CD[$i];
        if ($rcontent->[0] <= ($maxsize - $k)) {
            $rcontent->[0] += $k;
            push @{ $rcontent->[1] }, $nom;
            $inserted = 1;
        }
        last if $inserted;
    }
    push @CD, [$k, [$nom]] unless $inserted;   # no room anywhere: start a new CD
}
close(SOURCE) or die "Couldn't close: $!\n";

if ($makeiso) {
    foreach $i (0 .. $#CD) {
        my $rcontent = $CD[$i];
        print "mkisofs -o cd$i.iso "
            . join(" ", map { qq/'$_'/ } @{ $rcontent->[1] });
        print "\n";
    }
}
else {
    $Data::Dumper::Varname = "CD";
    print Data::Dumper::Dumper(@CD);
}
It's up to you to grep the output to set up your burn process.
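For example, assuming the script is saved as cdselect.pl (the filename and these invocations are illustrative, not from the original post):

    perl cdselect.pl -max=690000 -pattern='*'        # dump the proposed layout via Data::Dumper
    perl cdselect.pl -max=690000 -mkisofs | sh       # print the mkisofs commands and run them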
On the advice of b10m I've added a mkisofs output.
Hope you find it useful.

Re: quick CD burner Selector/Optimiser
by Aristotle (Chancellor) on Dec 15, 2003 at 04:24 UTC
      I didn't know about those modules (there are so many ...); that sounds very complete compared to my script,
      but my aim was efficiency and simplicity.
      Did I do it right?
Re: quick CD burner Selector/Optimiser
by b10m (Vicar) on Dec 15, 2003 at 12:47 UTC
    This is something useful. I myself have certain directories to back up and am always messing with the CD layouts :) Three points I'd do differently, though:
    1. I'd like to see a command-line parameter to set the maxsize variable, since sometimes I use DVDs for backing up, sometimes CD-Rs.
    2. I'm a lame guy and like the "-h" (human readable) flag in most apps. I'd probably like the input/output better if it stated sizes in megabytes rather than kilobytes. (A sketch of points 1 and 2 follows the list.)
    3. The output (through Data::Dumper) is OK for testing purposes. If you, however, want to actually use this program's output directly, say with "mkisofs", I'd probably want a different output ;)
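    A minimal sketch of what points 1 and 2 might look like, using Getopt::Long (the option names and the MB convention are illustrative, not part of dominix's script):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Getopt::Long;

        my $maxsize_mb = 690;    # capacity in MB; the default suits a CD-R
        my $makeiso    = 0;
        GetOptions(
            'max=i'   => \$maxsize_mb,   # e.g. -max=4482 for a "4.7 GB" DVD
            'mkisofs' => \$makeiso,
        ) or die "usage: $0 [-max=MB] [-mkisofs]\n";

        # keep kilobytes internally (du -sk reports KB), show megabytes to the user
        my $maxsize_kb = $maxsize_mb * 1024;
        printf "capacity: %d MB (%d KB)\n", $maxsize_mb, $maxsize_kb;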
    --
    b10m
      I like the idea of the raw output of Data::Dumper, but you're right, it helps to have a generated mkisofs output. So I'll add a special command-line option for it soon.
Re: quick CD burner Selector/Optimiser
by spartan (Pilgrim) on Dec 19, 2003 at 20:52 UTC
    I really like this. I could have used it at the last two places I've worked. The only annoying thing is that it does not handle directories larger than the size of a CD.
    I know it's easier to say "hey, your thingy doesn't do this really neato thing" than to point out the problem and also give a solution. Alas, I am far from providing solutions, but I thought you'd like to know this is a nice little program.

    UPDATE:
    Oh, for heaven's sake... Had I taken the time to look a little more closely, I'd have seen that changing du -sk to du -dk gives a MUCH more granular look at the directory tree below where you invoke the script.
    Good stuff!


    Very funny Scotty... Now PLEASE beam down my PANTS!
      At least the script gives you the names of the directories you have to split ...
       du -dk? What version is that? Did I get such an old distro ;-) that I don't have this option??

        My GNU du doesn't have a -d option either, but on Solaris -d means 'Do not cross filesystem boundaries.' The equivalent GNU option is -x or --one-file-system.
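        For the per-subdirectory breakdown spartan was after, GNU du spells that --max-depth; a rough equivalent of his du -dk would be:

            du -k --max-depth=2 | sort -nr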

        I've been toying with this backup thing for a couple of days; my new work-provided home machine came with a DVD+RW drive that I want to put to some good use. But I have to admit I think it's much better/cheaper nowadays to get an extra HD for backups. Mounting another 120G and using rsync beats the snot out of trying to fit things in 700M or 4.7G chunks.

        But I've still been looking at the problem from the angle of 'it is easier to send my sister a DVD than a HD', and I'm working on a little script to do just that.

        Some tidbits from my digging...

        • DVDs, it appears, have a 2k block size. Assuming some sort of regular-ish filesystem on the media, it might be wise to support block size in the calculations: a 1-byte file will take at least 2k on the disc (see the sketch after this list).
        • I haven't found a way to calculate the iso9660 filesystem overhead except by running mkisofs with --print-size. If anybody knows the overhead calculations, please share =).
        • iso9660 has a max-filesize limit; it's either 1 or 2 G, I forget which. So you won't be able to back up that 3G file on a DVD the obvious way.
        • mkisofs will automatically split those large files into filename_00, filename_01, filename_02, ... chunks. You'll have to put them back together yourself.
        • If you have RW media and multisession capability, there is a neato patch for mkisofs that will do incremental backups. Something like:
          mkisofs -r -J -root=backup_00 /path
          followed by:
          mkisofs -r -J -root=backup_01 -old-root=backup_00 /path
          If you burn the first ISO, then the second as a multisession append, then when you mount the disc, backup_01 will have the most recent files, plus hardlinks back to files that didn't change between backups. I haven't quite figured this out yet, since DVD+RW multisession isn't working on Linux 2.6.0 yet (should be fixed soon).
        • It might be more appropriate to use the -H option over -h: -h counts in powers of 1024, -H in powers of 1000. The 4.7G DVD is stated in -H units; not sure about the 700M CD, but I would guess it's in -H units also.
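        A minimal sketch of that block-size rounding (the helper name is made up for illustration):

            #!/usr/bin/perl
            use strict;
            use warnings;
            use POSIX qw(ceil);

            # Round a size in bytes up to whole 2048-byte blocks, so even a
            # 1-byte file counts as a full 2k toward the disc capacity.
            sub on_disc_bytes {
                my ($bytes, $blocksize) = @_;
                $blocksize ||= 2048;
                return ceil($bytes / $blocksize) * $blocksize;
            }

            print on_disc_bytes(1),    "\n";   # 2048
            print on_disc_bytes(4097), "\n";   # 6144

        And on the -h/-H point: a "4.7 GB" DVD is 4.7e9 bytes, which comes to only about 4.38G when counted in powers of 1024.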

        If anybody knows of any interesting and not-too-difficult algorithms for packing, I would love to know about them. It seems Algorithm::Bucketizer only has a couple that are either simple or random; I'd like to find something with a bit of heuristics added.
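        For what it's worth, the script in the root node is essentially first-fit decreasing: sort the items largest first, drop each one into the first bin with room, and open a new bin otherwise. A standalone sketch of that heuristic (the names and sizes are illustrative):

            #!/usr/bin/perl
            use strict;
            use warnings;

            # First-fit decreasing: a classic bin-packing heuristic.
            # %items maps a name to its size in KB; $cap is the bin capacity in KB.
            sub first_fit_decreasing {
                my ($cap, %items) = @_;
                my @bins;    # each bin is [ used_kb, [ names... ] ]
              ITEM:
                for my $name (sort { $items{$b} <=> $items{$a} } keys %items) {
                    my $size = $items{$name};
                    next ITEM if $size > $cap;    # oversized; report it separately
                    for my $bin (@bins) {
                        if ($bin->[0] + $size <= $cap) {
                            $bin->[0] += $size;
                            push @{ $bin->[1] }, $name;
                            next ITEM;
                        }
                    }
                    push @bins, [ $size, [$name] ];
                }
                return @bins;
            }

            my @bins = first_fit_decreasing(700_000,
                mp3s => 400_000, pics => 350_000, docs => 250_000);
            for my $i (0 .. $#bins) {
                printf "cd%d: %d KB (%s)\n",
                    $i, $bins[$i][0], join ", ", @{ $bins[$i][1] };
            }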
