If I understand correctly, you want to build a searchable index of your documents, with 10,000 distinct indexable terms and 100,000 synonyms for those terms. The global substitution seems like a bad way to go, and it's not clear to me how you'd get away with only 10,000 substitutions. It seems like you should have 100,000 keys and 10,000 values in your %termhash.
I think what you'd want to do is, given a synonym hash, make a reverse mapping that keys each unique value to a regex alternation over all the keys that map to it.
my %synhash = ( meat => 'meat', ham => 'meat', beef => 'meat');
my %revsynhash;
# Collect the synonyms as arrays keyed by canonical term
while (my ($k, $v) = each %synhash) {
    push @{ $revsynhash{$v} }, $k;
}
# Turn the arrays into alternative-lists, longest first
for my $k (keys %revsynhash) {
    $revsynhash{$k} = join '|',
        sort { length $b <=> length $a } @{ $revsynhash{$k} };
}
# Now you have (order of the equal-length synonyms may vary):
# %revsynhash = ( meat => 'beef|meat|ham' )
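To actually count hits with those alternations, you'd want to anchor them with word boundaries so 'ham' doesn't match inside 'hamster'. A minimal sketch (the sample text is made up for illustration):

```perl
use strict;
use warnings;

my %revsynhash = ( meat => 'beef|meat|ham' );

my $text = "We had ham and beef, but no fish.";

# For each indexed term, wrap its alternation in \b...\b and
# count how many times it matches in the text.
for my $term (keys %revsynhash) {
    my $re    = qr/\b(?:$revsynhash{$term})\b/i;
    my $count = () = $text =~ /$re/g;   # count matches in list context
    print "$term: $count\n";            # meat: 2
}
```

The `my $count = () = ...` idiom forces the match into list context and counts the results.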
So you run your indexing scheme using the 10,000 items in %revsynhash, the keys being the indexed term, while the values are the regexes to count as hits for those terms. When someone wants to look up a term, you use %synhash to translate it, and hit your index for what it translated to.
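The lookup side then becomes a two-step translate-and-fetch. A minimal sketch, where %index and its contents are hypothetical illustration data standing in for whatever index structure you build:

```perl
use strict;
use warnings;

my %synhash = ( meat => 'meat', ham => 'meat', beef => 'meat' );

# Hypothetical index built by the scheme above: canonical term => doc list.
# (These contents are made up for illustration.)
my %index = ( meat => [ 'doc1.txt', 'doc7.txt' ] );

# Translate the user's term through %synhash, then hit the index
# with what it translated to.
sub lookup {
    my ($term) = @_;
    my $canonical = $synhash{ lc $term };
    return unless defined $canonical;
    return @{ $index{$canonical} || [] };
}

print join(' ', lookup('Ham')), "\n";   # doc1.txt doc7.txt
```

Unknown terms simply return an empty list, so callers can treat "no such term" and "no hits" the same way.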
Caution: Contents may have been coded under pressure.