PerlMonks |
Re: fetching and storing data from web
by tospo (Hermit) on Jan 27, 2012 at 09:25 UTC
I agree with the previous posts that this is an ambitious project for a beginner, but don't let that stop you.

Maybe start by trying to read some data from one of your already-downloaded web pages with Perl, without using any additional modules, just to get a grip on the language. For example, read up on how to open a file and how to use pattern matching (regular expressions) to pull certain data out of it based on its textual context. There are plenty of examples out there that you can use as a starting point. You would then write the results to a simple text file.

Then maybe modify that so your output is a proper CSV file that can be opened directly in Excel. This can be done simply by printing your data with commas in between and quoting text fields. No need for an external module in most cases (although there are modules like Text::CSV that help with the more complex cases).

Once you can do that, try to fetch the data directly from the web with LWP::Simple instead of reading from a file. First write a script that uses LWP::Simple just to download the whole page and print everything to a local file. Then combine that with your parser and you are almost done.

If you really want the data in a proper database, you should learn basic SQL (the database query language) and the Perl way of interacting with a database (the DBI, or an ORM like DBIx::Class - too much to get into the details here), but be prepared that that's not going to be done in one day.

Keep going and good luck!!
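To make the first steps concrete, here is a minimal sketch that reads a saved page, pulls out data with a pattern match, and prints CSV lines. The file name and the regex are made up for illustration; you would adapt the pattern to the markup of your actual page.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # hypothetical input file - a page you saved earlier
    my $file = 'saved_page.html';
    open my $fh, '<', $file or die "Cannot open $file: $!";

    while ( my $line = <$fh> ) {
        # hypothetical pattern: grab two adjacent table cells
        if ( $line =~ m{<td>([^<]+)</td>\s*<td>([^<]+)</td>} ) {
            my ( $name, $value ) = ( $1, $2 );
            # quote the text field so embedded commas don't break the CSV
            printf qq{"%s",%s\n}, $name, $value;
        }
    }
    close $fh;

Redirect the output to a file (or open an output filehandle) and you have something Excel can open.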
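The download step with LWP::Simple might look like this; the URL and the output file name are placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Simple qw(getstore is_success);

    # placeholder URL - replace with the page you want
    my $url  = 'http://example.com/data.html';
    my $file = 'downloaded_page.html';

    # getstore fetches the URL and writes it straight to a local file,
    # returning the HTTP status code
    my $status = getstore( $url, $file );
    die "Download failed with status $status\n" unless is_success($status);
    print "Saved $url to $file\n";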
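And for the database step, a sketch using DBI with SQLite, so there is no server to set up. The table and column names are invented for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # connect to (or create) a local SQLite database file
    my $dbh = DBI->connect( 'dbi:SQLite:dbname=mydata.db', '', '',
        { RaiseError => 1, AutoCommit => 1 } );

    # hypothetical table for the scraped data
    $dbh->do('CREATE TABLE IF NOT EXISTS results (name TEXT, value TEXT)');

    # placeholders (?) keep you safe from quoting problems
    my $sth = $dbh->prepare(
        'INSERT INTO results (name, value) VALUES (?, ?)');
    $sth->execute( 'example_name', 'example_value' );

    $dbh->disconnect;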
In Section: Seekers of Perl Wisdom