Re: best way to pass data
by blokhead (Monsignor) on Aug 21, 2003 at 19:35 UTC
open my $fh, "|/tmp/scripts.pl"
or die "Can't pipe scripts.pl: $!";
## send the data to scripts.pl:
print $fh @ARRAY;
## or:
print $fh join "\n", @ARRAY;
## or probably best:
print $fh $_ for @ARRAY;
## or, depending on what's exactly in @ARRAY:
print $fh "$_\n" for @ARRAY;
Then have scripts.pl read the data from its STDIN. If it can process the data as it comes in (instead of needing all of it before doing its thing), then the last two options above will be the most efficient, as they send one element of the array at a time to scripts.pl.
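For reference, the receiving end is just a read loop on STDIN. Here is a self-contained sketch of the one-element-per-line protocol, using an in-memory handle to stand in for the real pipe to /tmp/scripts.pl (the variable names are illustrative):

```perl
use strict;
use warnings;

# Sender side, exactly as above, but writing into an in-memory handle
# instead of the pipe so this sketch runs on its own.
my @ARRAY = ('alpha', 'beta', 'gamma');

my $buffer = '';
open my $out, '>', \$buffer or die "Can't open in-memory handle: $!";
print $out "$_\n" for @ARRAY;
close $out;

# Receiver side: scripts.pl would run this same loop on STDIN.
open my $in, '<', \$buffer or die $!;
my @received;
while ( my $line = <$in> ) {
    chomp $line;
    push @received, $line;    # or process $line right away
}
close $in;
```

Because the receiver handles one line at a time, it never needs the whole array in memory at once.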
Whether this is the best way, I don't know. But it is a much better way than trying to have thousands of lines of code interpolated within backticks!
blokhead
Re: best way to pass data
by ehdonhon (Curate) on Aug 21, 2003 at 19:33 UTC
Well, if that gets converted to a '/bin/sh -c', it will probably crash if @array really does have thousands of lines in it. There is a hard limit (the kernel's ARG_MAX) on the total size of the argument list a command can be started with.
Here is one solution:
open( PROG, '| /tmp/scripts.pl' ) or die "Can't start scripts.pl: $!";
for ( @ARRAY ) {
    print PROG $_;
}
close( PROG ) or warn "scripts.pl exited abnormally: $?";
If you do it that way, then you will have to write your other script to accept the data via STDIN.
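A minimal sketch of that receiving side, with the slurp pulled into a sub so it is easy to test (the name read_all is just illustrative):

```perl
use strict;
use warnings;

# Slurp every line from a handle into an array, newlines stripped.
sub read_all {
    my ($fh) = @_;
    my @lines = <$fh>;
    chomp @lines;
    return @lines;
}

# In scripts.pl this would be:  my @data = read_all(\*STDIN);
```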
If this is too slow for you, you might want to take a look at IPC::Shareable::SharedMem for a shared memory solution.
Re: best way to pass data
by jmanning2k (Pilgrim) on Aug 21, 2003 at 19:44 UTC
There are much, much better ways. The command you listed is enough to make most people cringe (myself included).
Passing the whole array as an argument on the command line will probably fail. There's simply a limit to how long the argument list can be, and shell interpolation will mangle any elements containing whitespace or newlines anyway.
Here's a quick solution (many others are possible):
Make the receiving script read from STDIN (i.e. while (<>) { chomp; print; }).
Then instead of backticks, use open.
# Note the | here. This allows you to write to your script
open(SCRIPTS, "| /tmp/scripts.pl") or die "Can't run scripts.pl: $!";
foreach my $line (@ARRAY) {
    print SCRIPTS $line . "\n";
}
close SCRIPTS;
Hope this helps, ~J
Sorry to be stupid here, but how do I get the receiving script to put the stuff into an array to use? I am sending a web page to the second script to process.
I've tried:
$uploader = $query->upload("infile");
while (<$uploader>) {
    push (@import_files, $uploader);
}
And
while (<>){
push (@import_files, shift;);
}
and six or seven variants. Please help.
Thanks
Rishard
Found this here: http://dbforums.com/arch/95/2002/6/400241
Basically
##Master
...some code...
system ("./slave.pl @array");
...end code...
##Slave
@import_array = @ARGV;
...code...
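One caveat with that approach: the string form system("./slave.pl @array") goes through the shell, so elements containing spaces or metacharacters get mangled (and the argument-length limit mentioned above still applies). The list form of system avoids the shell entirely. A runnable sketch, using perl itself as a stand-in for slave.pl:

```perl
use strict;
use warnings;

my @array = ('one value', 'another', 'a;third');

# List form of system: each element lands in the child's @ARGV intact,
# with no shell in between. With the real script it would be
#   system('./slave.pl', @array);
# Here a perl one-liner stands in for slave.pl so the sketch runs.
my $status = system($^X, '-e', 'exit(scalar @ARGV)', @array);
my $argc   = $status >> 8;    # child's exit code = its argument count
print "child saw $argc arguments\n";
```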
rishard
Re: best way to pass data
by bear0053 (Hermit) on Aug 21, 2003 at 19:36 UTC
An alternative way would be to declare @array as:
our @array;
require ('/tmp/scripts.pl');
Now scripts.pl will be able to use @array (as long as it is declared with our there as well), and the code will still run cleanly under strict, so it should be easy to follow.
Or you can leave @array as a my variable:
my @array;
require ('/tmp/scripts.pl');
someFunction(@array);
this way you can just pass in @array to a function in the other script and that should work as well.
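A self-contained sketch of the require-and-call approach: here a stand-in "scripts.pl" is written to a temp file so the demo runs on its own (the file contents and the name someFunction are illustrative):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Write a minimal stand-in for /tmp/scripts.pl to a temp file.
my ($fh, $file) = tempfile(SUFFIX => '.pl');
print $fh <<'EOF';
sub someFunction { my $t = 0; $t += $_ for @_; return $t }
1;   # a required file must end with a true value
EOF
close $fh;

require $file;                    # compiles the file into this process
my $sum = someFunction(1, 2, 3);  # the sub is now visible in package main
print "sum = $sum\n";
```

Note the trailing 1; in the required file: require dies if the file does not return a true value.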
Re: best way to pass data
by NetWallah (Canon) on Aug 22, 2003 at 04:55 UTC
A general solution to this issue is discussed in the Object Serialization Basics node, which covers the following modules:
- Storable
- FreezeThaw
- Data::Dumper
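For example, with Storable (a core module) the whole array becomes one string that can be sent down the pipe from the other replies and rebuilt on the far side. A minimal round-trip sketch:

```perl
use strict;
use warnings;
use Storable qw(freeze thaw);

# Serialize the whole array to one string; the receiver thaws it back.
# Unlike the line-per-element approach, this survives elements that
# themselves contain newlines, and nested structures too.
my @array = ('plain', "two\nlines", [1, 2, 3]);
my $frozen = freeze(\@array);    # this string could go down the pipe

my @copy = @{ thaw($frozen) };   # receiver side
print scalar @copy, " elements round-tripped\n";
```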
I would also recommend looking at the YAML module as a possible alternative to Data::Dumper. That way, if you ever want to read or print your data structure so that humans can understand it, you have very little to do!
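A sketch of that idea with the YAML module (from CPAN, not core perl); Dump gives you text a human can read or edit, and Load rebuilds the structure:

```perl
use strict;
use warnings;
use YAML qw(Dump Load);    # CPAN module, not in core perl

my @array = ('alpha', 'beta', { n => 3 });
my $text  = Dump(\@array);        # human-readable serialization
my @copy  = @{ Load($text) };     # rebuild the structure
print $text;
```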
--Attila (need to find my passwd...)
Re: best way to pass data
by Anonymous Monk on Aug 23, 2003 at 09:11 UTC
I fear there is a mistake in your concept. Why don't you put the scripts.pl functionality into a module and then call it from your main script? That's what was suggested above.
Read the perlmod tutorial. It's so simple; here's an example.
In Scripts.pm:
package Scripts;
sub do_something {}
1;
In main.pl:
use Scripts;
my @ARRAY = <DATA>;
Scripts::do_something( \@ARRAY );
Et voilà!
Murat
The reason I listed the response dated Aug 23, 2003 at 04:40 is that Master.pl runs very fast. One of the calls is in a loop, so it starts several instances of Slave.pl, which queries outside databases and has a long delay. I do not want to hold up Master.pl waiting on Slave.pl; I just want things to go. In my second need for this, Master2.pl calls and sends data to three different concurrent processes. Master does not need to know the success or failure of the processes; that is taken care of by a separate process. rishard