in reply to How to maximise the content of my data CD
This is like the knapsack problem, but not quite (it is closer to bin packing). The difference is that you don't need to hit an exact target; you just need to avoid wasting any more CDs than an exact solution would.
For instance, say you have 2GB worth of files. A perfect solution would fit them on 3 CDs with room to spare; as long as your solution doesn't require 4 CDs, you have sufficiently solved the problem.

I got into a heated debate on this exact problem in IRC some time ago and could have sworn that I posted about it here, but I can't find it. There is a recent similar thread (Burning ISOs to maximize DVD space), which mentions Algorithm::Bucketizer, which I haven't tried myself. I would attempt the following:
- $buckets = int($total_size / 700) + 1 (sizes in MB, assuming 700 MB per CD)
- Order files by size in descending order
- Round robin files (1 per bucket)
- When you encounter the first file that will not fit, stay with that bucket and continue down the list until you find one that does fit
- On the next bucket, start back at the top of the file list
- Wash, rinse, repeat
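The steps above can be sketched roughly as follows (Python rather than Perl, purely as an illustration; the file sizes, names, and 700 MB capacity are my assumptions, not from the thread):

```python
CD_CAPACITY = 700  # MB per CD (an assumption; adjust for your media)

def pack(files, capacity=CD_CAPACITY):
    """Pack files (dict: name -> size in MB) into int(total/capacity)+1
    buckets using the round-robin heuristic described above.
    Returns (buckets, leftovers); leftovers is empty on success."""
    total = sum(files.values())
    n_buckets = int(total / capacity) + 1          # the "+1" from the first bullet
    free = [capacity] * n_buckets
    buckets = [[] for _ in range(n_buckets)]

    # Order files by size in descending order
    remaining = sorted(files.items(), key=lambda kv: kv[1], reverse=True)

    placed = True
    while remaining and placed:
        placed = False
        # Round robin: each bucket gets at most one file per pass
        for b in range(n_buckets):
            # Start back at the top of the list; take the first file that fits
            for i, (name, size) in enumerate(remaining):
                if size <= free[b]:
                    buckets[b].append(name)
                    free[b] -= size
                    del remaining[i]
                    placed = True
                    break
    return buckets, remaining

# Example: 1400 MB of files spread across int(1400/700)+1 = 3 buckets
buckets, left = pack({"a": 400, "b": 300, "c": 300, "d": 200, "e": 150, "f": 50})
```

Note that this is a heuristic, not an optimal packer: if the "+1" slack isn't enough, the second return value holds the files that didn't fit.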
Cheers - L~R
Update: As pointed out below, perfect solutions that exactly (or even very nearly) match a whole number of CDs will wind up costing you 1 extra CD. That's why the +1 in the first bullet.
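A quick worked example of the arithmetic behind the +1 (again assuming 700 MB CDs): a total that exactly fills a whole number of CDs would need a *perfect* packing to get by with the theoretical minimum, so the heuristic deliberately allocates one spare bucket:

```python
import math

total = 1400                      # MB: exactly fills two 700 MB CDs
perfect = math.ceil(total / 700)  # theoretical minimum, achievable only by a perfect packing
slack = int(total / 700) + 1      # the "+1" rule: one spare bucket for the heuristic
print(perfect, slack)
```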
Replies are listed 'Best First'.
Re^2: How to maximise the content of my data CD
by Eimi Metamorphoumai (Deacon) on Feb 25, 2005 at 14:28 UTC
by Limbic~Region (Chancellor) on Feb 25, 2005 at 14:40 UTC
Re^2: How to maximise the content of my data CD
by MidLifeXis (Monsignor) on Feb 25, 2005 at 17:57 UTC
by Anonymous Monk on Feb 25, 2005 at 22:22 UTC
by Zero_Flop (Pilgrim) on Feb 26, 2005 at 05:58 UTC
by Limbic~Region (Chancellor) on Feb 27, 2005 at 18:07 UTC
by MidLifeXis (Monsignor) on Feb 28, 2005 at 18:53 UTC
by Limbic~Region (Chancellor) on Feb 28, 2005 at 19:12 UTC