PerlMonks
Download big file in chunks with plack response

by Thenothing (Acolyte)
on Apr 21, 2021 at 10:04 UTC ( #11131529=perlquestion )

Thenothing has asked for the wisdom of the Perl Monks concerning the following question:

Hello everyone. I really tried to fix this problem myself, but this time I need your help. This code works to download a small file:
use strict;
use warnings;
use Plack::Response;
use Path::Tiny;
use File::MimeInfo;

sub return_psgi {
    my $self      = shift;
    my $path_file = shift;

    my $content_type = mimetype($path_file);
    $content_type = 'application/octet-stream' if not defined $content_type;

    my $content          = path($path_file);
    my $content_download = $content->slurp;

    require File::Spec::Unix;
    my $basename = $path_file;
    $basename =~ s|\\|/|g;
    $basename = ( File::Spec::Unix->splitpath($basename) )[2];
    $basename =~ s|[^\w\.-]+|_|g;

    my $response = Plack::Response->new(200);
    $response->headers(
        [
            'Content-Type'        => $content_type,
            'Content-Disposition' => qq[attachment; filename="$basename"],
        ],
    );
    $response->body($content_download);
    return $response->finalize;
}
But I want to download big files, greater than 1 GB, so I changed the code to download in chunks:
open (FILE, "< $path_file") or die "can't open $path_file: $!";
binmode FILE;
local $/ = \102400;
while (<FILE>) {
    my $var = scalar $_;
    $response->body($var);
    return $response->finalize;
}
close FILE;
Another attempt, using autoflush:
use IO::Handle;
my $BLOCK_SIZE = 1024 * 1024;
open(my $fh, "<:raw", $path_file)
    or die "Can't read from input file $path_file $!\n";
while (read($fh, $buffer, $BLOCK_SIZE)) {
    $response->body($buffer);
    $fh->autoflush;
}
close $fh;
return $response->finalize;
The two previous attempts download only a few kilobytes, and the resulting file is corrupt.

Can you please tell me where the error in my code is?

Replies are listed 'Best First'.
Re: Download big file in chunks with plack response
by choroba (Cardinal) on Apr 21, 2021 at 10:15 UTC
    while (<FILE>) {
        my $var = scalar $_;
        $response->body($var);
        return $response->finalize;
    }

    $_ is already a scalar. Calling it in a scalar context doesn't change anything, so you can also use

    my $var = $_;

    If you return from a while loop, the loop is done. In fact, the whole subroutine is done. The next iteration never happens.
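    A tiny illustration of that point (a hypothetical sub, not from the thread): the `return` fires on the first pass through the loop, so only one chunk is ever processed.

    ```perl
    use strict;
    use warnings;

    # Hypothetical helper mimicking the OP's loop shape: a return inside
    # the loop ends the whole sub on the first iteration.
    sub count_processed_chunks {
        my @chunks    = @_;
        my $processed = 0;
        for my $chunk (@chunks) {
            $processed++;
            return $processed;   # loop AND sub end here, first time through
        }
    }

    print count_processed_chunks( 'a' .. 'e' ), "\n";   # prints 1, not 5
    ```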

    autoflush is documented in IO::Handle:

    Any unread data in the buffer will be discarded

    Are you sure you want to do that?

    Update: You used flush on the input, not the output. But perlvar for $|, which is the same as autoflush, says:

    This has no effect on input buffering

Re: Download big file in chunks with plack response
by kikuchiyo (Hermit) on Apr 21, 2021 at 14:32 UTC

    See the section Delayed Response and Streaming Body in the PSGI specification.

    What you want is something like this (untested):

    use strict;
    use warnings;
    ####use Plack::Response; # doesn't seem to support streaming response
    use Path::Tiny;
    use File::MimeInfo;

    sub return_psgi {
        my $self      = shift;
        my $path_file = shift;

        my $content_type = mimetype($path_file);
        $content_type = 'application/octet-stream' if not defined $content_type;

        my $content          = path($path_file);
        my $content_download = $content->slurp;

        require File::Spec::Unix;
        my $basename = $path_file;
        $basename =~ s|\\|/|g;
        $basename = ( File::Spec::Unix->splitpath($basename) )[2];
        $basename =~ s|[^\w\.-]+|_|g;

        return sub {
            my $responder = shift;
            my $writer    = $responder->(
                [
                    200,
                    [
                        'Content-Type'        => $content_type,
                        'Content-Disposition' => qq[attachment; filename="$basename"],
                    ]
                ]
            );
            open (FILE, "< $path_file") or die "can't open $path_file: $!";
            binmode FILE;
            local $/ = \102400;
            while (<FILE>) {
                $writer->write($_);
            }
            $writer->close();
            close FILE;
        }
    }
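    To see what the delayed-response protocol does, it can be exercised without a real server. A minimal sketch (plain Perl, no Plack required; the `FakeWriter` class and the responder here are stand-ins for what a PSGI server would provide):

    ```perl
    use strict;
    use warnings;

    # A streaming PSGI app: returns a coderef instead of the usual arrayref.
    my $app = sub {
        my $env = shift;
        return sub {
            my $responder = shift;
            my $writer = $responder->( [ 200, [ 'Content-Type' => 'text/plain' ] ] );
            $writer->write($_) for qw(first- second- third);
            $writer->close;
        };
    };

    # Stand-in for the server side: collects whatever the app writes.
    package FakeWriter;
    sub new   { bless { data => '' }, shift }
    sub write { $_[0]{data} .= $_[1] }
    sub close { }

    package main;
    my $writer;
    my $responder = sub {
        my ($status_and_headers) = @_;   # a real server would send these now
        $writer = FakeWriter->new;
        return $writer;
    };

    my $delayed = $app->({});     # the app returns the coderef...
    $delayed->($responder);       # ...which the server calls with a responder
    print $writer->{data}, "\n";  # first-second-third
    ```

    The key property is that each `$writer->write` hands one chunk to the server, so at no point does the whole file sit in memory.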
Re: Download big file in chunks with plack response
by Anonymous Monk on Apr 21, 2021 at 22:03 UTC
Re: Download big file in chunks with plack response
by Thenothing (Acolyte) on Apr 26, 2021 at 00:14 UTC
    Updated at the end

      here is the final code:

      Correction ;)

      my $app = sub {
          use Path::Tiny     qw/ path /;
          use File::MimeInfo qw/ mimetype /;
          my $env       = shift;
          my $path_file = 'win.iso';

          my $content_type = mimetype($path_file) || 'application/octet-stream';
          my $basename     = path($path_file)->basename;
          $basename =~ s{[^\w\.-]+}{_}g;

          my $filehandle = path($path_file)->openr_raw;
          return [
              200,
              [
                  'Content-Type'        => $content_type,
                  'Content-Length'      => -s $filehandle,
                  'Content-Disposition' => qq[attachment; filename="$basename"],
              ],
              $filehandle
          ];
      };
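      This works because PSGI also allows the body element to be a filehandle: the server itself pulls from the handle in chunks, so the file is never slurped into memory. The server-side loop looks roughly like this (a sketch, with a temporary file standing in for win.iso):

      ```perl
      use strict;
      use warnings;
      use File::Temp qw/ tempfile /;

      # Create a stand-in for the large file.
      my ( $fh_out, $tmpname ) = tempfile( UNLINK => 1 );
      print {$fh_out} 'x' x 250_000;   # 250 kB of dummy data
      close $fh_out;

      open my $filehandle, '<:raw', $tmpname or die "open: $!";

      # What a PSGI server does with a filehandle body: read fixed-size
      # chunks, so memory use stays at one chunk regardless of file size.
      my $sent = 0;
      while ( my $read = read( $filehandle, my $buffer, 102_400 ) ) {
          $sent += $read;   # a real server would write $buffer to the socket here
      }
      close $filehandle;
      print "$sent bytes streamed\n";   # 250000 bytes streamed
      ```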
Re: Download big file in chunks with plack response
by Thenothing (Acolyte) on Apr 22, 2021 at 16:12 UTC
    Thanks to all for your participation. I tried what kikuchiyo suggested, but the result is the same as with my first code: I get an Out of memory error.

    Does anyone know an app that has this feature and works with PSGI?

      Ehh, that's what I get for not testing my submission.

      Comment out these lines:

      my $content          = path($path_file);
      my $content_download = $content->slurp;
        Hi, yes, I did that before running it.

Node Type: perlquestion [id://11131529]
Approved by haukex
Front-paged by haukex