PerlMonks
This does analysis on the closing data of stocks, and it needs to finish before the next run to be useful: 20 hours is the extreme cutoff, 10 hours is much more favorable, and anything below that is great. (The number of stocks we look at also changes. It is currently below 2000, closer to 1000, but over time that figure will grow past 2000 as more data is collected. Faster hardware will help with that, but I still wanted to plan around having to handle roughly 2000 of them.)
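The time budget above can be sanity-checked with a quick sketch. The node count and per-stock analysis time here are illustrative assumptions, not figures from the post; the point is just how evenly ~2000 jobs divide across a small cluster and what that does to wall-clock time.

```python
# Sketch: splitting ~2000 per-stock jobs across a small cluster.
# 8 nodes and 36 s per job are assumed numbers for illustration.

def chunk(jobs, n_nodes):
    """Deal jobs out round-robin so each node gets an even share."""
    return [jobs[i::n_nodes] for i in range(n_nodes)]

symbols = [f"STOCK{i:04d}" for i in range(2000)]  # placeholder tickers
per_node = chunk(symbols, 8)

# Every job is assigned, and no node gets more than one extra job.
assert sum(len(s) for s in per_node) == 2000
assert max(len(s) for s in per_node) - min(len(s) for s in per_node) <= 1

# Back-of-envelope wall clock: at ~36 s per analysis, 2000 jobs take
# 20 hours serially, or about 2.5 hours spread across 8 nodes
# (assuming the jobs are independent, as the post implies).
serial_hours = 2000 * 36 / 3600
print(serial_hours, serial_hours / 8)
```

This only pays off because each stock's analysis is independent of the others, which is exactly the "lends itself to that" condition the post mentions.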
But like I said, once it gets down to the difference between 30 minutes and an hour, it doesn't matter much to me. I have a cluster of nodes that currently price out at about $350 each. I could build them even cheaper, but I use silent components to reduce noise when working near them, and those also tend to draw less power in order to run quieter. The cluster makes it feasible to take "slow" code and spread it out over several machines (as long as the task at hand lends itself to that) and get it done much faster. But it is certainly nice to have it run quickly on a single machine, so the cluster isn't tied up for that long: a single node can cruise through the job while the other nodes work on other things.

In reply to Re: Re: Confirming what we already knew
by AssFace