Oh, eww. Many of the solutions so far will completely fall apart with a distinct keyword set that large.
This problem just exploded. Really. Can you give an estimate on:
- K - the number of distinct keywords,
- I - the number of entries in the %items hash,
- L - the average number of keywords per %items entry, and
- N - the number of keywords taken at a time
I suspect that many of these solutions will show either time or memory complexity of O(K!/(N!(K-N)!)). I know that mine has the potential for memory complexity like that (though the %totals hash could be tied to a massive dbm file), and at least one of the other solutions has time complexity that looks like that.
Of course, if L is sufficiently large, you're screwed too, since I don't think there's a way to avoid spending at least O(L!/(N!(L-N)!)) time.
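To get a feel for how fast that blows up, here's a quick back-of-the-envelope (in Python just for the arithmetic; the K and N values are made up, since we don't know the real numbers yet):

```python
from math import comb

# Worst-case number of N-keyword combinations drawn from K distinct
# keywords: K!/(N!(K-N)!), i.e. "K choose N".
def combos(k, n):
    return comb(k, n)

# Illustrative (hypothetical) sizes only:
for k in (20, 100, 1000):
    for n in (2, 3, 5):
        print(f"K={k:5d} N={n}: {combos(k, n):,} combinations")
```

Even at K=100, N=5 you're already past 75 million combinations, so a %totals-style hash keyed on combinations is hopeless unless K or N is small.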
--
@/=map{[/./g]}qw/.h_nJ Xapou cets krht ele_ r_ra/;
map{y/X_/\n /;print}map{pop@$_}@/for@/