How do you make the choices? Once you can answer those questions, you'll be able to make some progress towards what you want.
I'd say an algorithm with minimum requirements could do the trick here.
For example:
- 10 URLs from the same domain with 95% "similarity" would be matched as a "clean URL".
- 100 URLs from the same domain would add more weight.
- The detected string has to differ by at least 4 characters before it adds weight/validity.
- Any others could be selected manually, or kept in a cache until they offer a difference.
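The rules above can be sketched roughly like this (a minimal illustration only; the thresholds, the `difflib`-based similarity measure, and all function names are my own assumptions, not a fixed recipe):

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Hypothetical thresholds taken from the bullet points above -- tune per site.
MIN_MATCHES = 10     # 10 similar URLs => treat the pattern as a "clean URL"
SIMILARITY = 0.95    # 95% string similarity
MIN_CHAR_DIFF = 4    # need at least 4 differing characters to add weight

def similarity(a: str, b: str) -> float:
    """String similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

def char_diff(a: str, b: str) -> int:
    """Rough character-level difference: longer length minus matched characters."""
    m = SequenceMatcher(None, a, b)
    matched = sum(block.size for block in m.get_matching_blocks())
    return max(len(a), len(b)) - matched

def score_pattern(template: str, urls: list[str]) -> dict:
    """Weigh same-domain URLs against a template URL per the rules above."""
    domain = urlparse(template).netloc
    weight = 0
    cache = []  # URLs held back until they offer a real difference
    for url in urls:
        if urlparse(url).netloc != domain:
            continue  # only same-domain URLs count
        if (similarity(template, url) >= SIMILARITY
                and char_diff(template, url) >= MIN_CHAR_DIFF):
            weight += 1
        else:
            cache.append(url)
    return {"clean": weight >= MIN_MATCHES, "weight": weight, "cached": cache}
```

Note that a 95% similarity combined with a 4-character difference only works for reasonably long URLs, so in practice you would scale these thresholds to the URL length.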
Some URLs will be easier than others (especially script URLs), but most sites have a general structure to stay SEO-friendly for search engines, which makes most URLs a dream to work with...