There's predictability and then there's predictability. It tends to be bad for simulations if your "random" numbers come out the same on every pass through the main loop.
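The classic way to end up with the same "random" numbers on every pass is to reseed the generator inside the loop. A minimal Python sketch of that bug (the function name and the seed value 42 are my own, for illustration):

```python
import random

def draw_values(n, reseed_each_pass=False):
    """Return n pseudo-random values, optionally reseeding inside the loop."""
    out = []
    for _ in range(n):
        if reseed_each_pass:
            # The bug: reseeding restarts the stream, so every
            # iteration produces the same "first" value.
            random.seed(42)
        out.append(random.random())
    return out

buggy = draw_values(3, reseed_each_pass=True)  # three identical values
ok = draw_values(3)                            # three distinct draws
```

The fix is simply to seed once, before the loop, and let the generator advance.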
It all depends on the application. Perhaps you are searching a large set for an element with a certain property; rather than iterating through the whole set, you pick a random starting point. IIRC, there are randomized primality tests that take a random input and can tell you that a number is composite. In that case, you want to keep the one random input that shows the number under examination is composite, since it serves as a proof. If I can find an example of such an algorithm later, I'll update this node.
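One well-known test of this kind is the Fermat test (the Miller-Rabin test works similarly): a random base a with a^(n-1) not congruent to 1 (mod n) is a "witness" that n is composite, and that single random input is a reusable proof. A hedged Python sketch (the function name and the trial count are my own choices):

```python
import random

def fermat_witness(n, tries=20):
    """Search for a random base a that proves n composite.

    If a^(n-1) % n != 1, Fermat's little theorem says n cannot be
    prime, so a is a keepable certificate of compositeness.
    Finding no witness only suggests n is *probably* prime.
    """
    for _ in range(tries):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return a  # keep this one random input: it certifies n composite
    return None

w = fermat_witness(221)  # 221 = 13 * 17, so a witness is almost surely found
```

Anyone can re-check the certificate later with a single modular exponentiation, without rerunning the random search.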
Entropy measures the number of possible states of a system. Determinism is about the transitions between states. The two concepts are largely orthogonal.
Yes. However, this does not contradict anything that I said. Some people were concerned that repeated calls to rand() would weaken the randomness of data read from places like /dev/random. I was saying that this is not the case, as perl uses a PRNG, which, given a specific seed, is completely deterministic.
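The point is generic to any PRNG, not just perl's: given the same seed, the generator replays exactly the same stream, and drawing from it consumes no system entropy. A small Python illustration of that determinism:

```python
import random

# Two independent generators seeded identically produce identical
# streams: a PRNG is completely deterministic given its seed.
gen_a = random.Random(1234)
gen_b = random.Random(1234)

stream_a = [gen_a.random() for _ in range(5)]
stream_b = [gen_b.random() for _ in range(5)]
# stream_a == stream_b, however many values you draw
```

Calling such a generator a million times tells an attacker nothing about /dev/random; the only secret that matters is the seed.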
Update: Check this out.