But it's only set up once.
In merlyn's article, the weights change every time a random value is chosen. It's reasonable that one might want to do that. It seems like it should be possible to set up some kind of search tree that will work in O(log n) time in that situation, but that's making my brain hurt.
Update: Couldn't let this go. Use a balanced binary tree. Each node knows its own weight, and the total weight of its subtree. Let $val = rand($tree->{total_weight}) and then
sub search {
    my ($tree, $val) = @_;
    return $tree if $val < $tree->{weight};
    $val -= $tree->{weight};
    my $left_total = $tree->{left} ? $tree->{left}{total_weight} : 0;
    return search($tree->{left}, $val) if $val < $left_total;
    $val -= $left_total;
    return search($tree->{right}, $val);
}
Insertions, deletions, and weight changes must recalculate up to log(n) weights. Now where's the brain specialist...
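For anyone who wants to kick the tires, here's a minimal sketch of the same idea in Python. The names (Node, subtotal, pick) are mine, not from merlyn's article, and it leaves out the rebalancing and the log(n) weight-update bookkeeping; it just shows the node layout and the search walk.

    import random

    class Node:
        """A tree node that knows its own weight and the total
        weight of the subtree rooted at it."""
        def __init__(self, weight, left=None, right=None):
            self.weight = weight
            self.left = left
            self.right = right
            self.total_weight = weight + subtotal(left) + subtotal(right)

    def subtotal(node):
        # An empty subtree contributes zero weight.
        return node.total_weight if node is not None else 0

    def search(node, val):
        """Walk down with a value in [0, root.total_weight):
        the node absorbs its own weight first, then the left
        subtree's total, and the remainder goes right."""
        if val < node.weight:
            return node
        val -= node.weight
        if val < subtotal(node.left):
            return search(node.left, val)
        val -= subtotal(node.left)
        return search(node.right, val)

    def pick(root):
        """Draw one node with probability proportional to its weight."""
        return search(root, random.uniform(0, root.total_weight))

After a weight change you'd recompute total_weight on the path back up to the root, which is where the "up to log(n) weights" cost comes from.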
...up until you need bigints for the indexes.
Where are you going to store an array that big?