I am new to scripting and I am trying to concatenate multiple files, whose paths are listed in a text file, and output a combined gzip file. For example, the list file File_list.txt contains these file paths:
/data/path/file1.txt
data2/path2/file2.txt
....file3.txt
....file4.txt
So far my code only handles all the files in a single local directory (and it outputs the combined file without gzipping it):
#!/usr/bin/perl
use strict;
use warnings;
use File::Slurp;

my $directory = 'Users/xyz/Documents/';

# Collect the plain files in the directory (skip . and ..)
opendir(my $dh, $directory) or die $!;
my @files = grep { -f "$directory/$_" } readdir($dh);
closedir $dh;

my $outfilename = 'Combined.fastq';
my $outfilesrc  = '';

# Slurp each file and append its contents
foreach (sort @files) {
    $outfilesrc .= File::Slurp::slurp("$directory/$_");
}

open(my $out, '>', "$directory/$outfilename")
    or die "Can't open for writing $directory/$outfilename: $!";
print $out $outfilesrc;
close $out;
exit;
Can someone please share how to read the files from this list rather than from a single directory? I know this is much easier in plain bash, but I am trying to create a module for a pipeline, so I need it in Perl. Thanks!
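A minimal sketch of one way to do this, assuming the list file has one path per line and using the core IO::Compress::Gzip module for the compressed output (the file names and variables here are illustrative, not from your pipeline):

#!/usr/bin/perl
use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);

# Illustrative names; adjust to your pipeline
my $list_file = 'File_list.txt';
my $out_file  = 'Combined.fastq.gz';

# Read the list of input paths, one per line
open(my $list_fh, '<', $list_file) or die "Can't open $list_file: $!";
chomp(my @files = <$list_fh>);
close $list_fh;

# Open a gzip-compressed output stream
my $gz = IO::Compress::Gzip->new($out_file)
    or die "Can't open $out_file for writing: $GzipError";

# Copy each listed file line by line so nothing is held entirely in memory
foreach my $file (@files) {
    next unless length $file;                   # skip blank lines in the list
    open(my $in, '<', $file) or die "Can't open $file: $!";
    $gz->print($_) while <$in>;
    close $in;
}

$gz->close;

An alternative is to open a pipe to the external gzip program (e.g. open(my $out, '|-', 'gzip -c > Combined.fastq.gz')), but the module approach above avoids depending on an external binary.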