I have a command-line utility from a third party (it's big and written in Java) that I've been using to help me process some data. This utility expects its input as a line-delimited file and writes the processed data to STDOUT.
In my testing phase, I was fine with writing some Perl to create a file full of information to be processed and then handing that file to the third-party utility. Now that I'm nearing production, though, I'd really prefer to pipe the data to the utility directly instead of first writing it to a file, since that would save me the overhead of writing unneeded information to disk. I recently asked on this board how I could do this in Unix, but I've since realized it would be far more convenient to run the utility directly from a Perl script or module. Perhaps something like:
system("bin/someapp do-action --option1 some_value --input $piped_in_data");
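Though thinking about it more, system() only passes arguments; it can't feed data to the child's STDIN. What I imagine actually working is a pipe open. A minimal sketch of the idea, assuming someapp falls back to reading STDIN when --input is omitted (I haven't confirmed that it does):

    # Open a one-way pipe to the utility and print prepared lines to it.
    # ASSUMPTION: someapp reads records from STDIN when no --input is given.
    open(my $pipe, '|-', 'bin/someapp', 'do-action', '--option1', 'some_value')
        or die "Cannot start someapp: $!";
    print $pipe "$_\n" for @records;    # @records stands in for my prepped data
    close($pipe) or warn "someapp exited with status $?";

With that, someapp's own STDOUT would still go wherever the script's STDOUT points, so I could redirect the whole script if that's all I need.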
Currently I call the utility as follows:
bin/someapp do-action --option1 some_value --input some_file
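For context, my current testing workflow looks roughly like this (simplified; File::Temp here is just a stand-in for however the intermediate file gets written):

    # Current approach: write the prepared records to a temp file,
    # then hand that file to the utility and let its output go to STDOUT.
    use File::Temp qw(tempfile);

    my ($fh, $filename) = tempfile();
    print $fh "$_\n" for @records;    # @records holds the prepared data
    close($fh);

    system('bin/someapp', 'do-action', '--option1', 'some_value',
           '--input', $filename) == 0
        or die "someapp failed: $?";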
Basically, what I want is to write all my data either to a variable or to STDOUT and then pipe it to the Java app through a system call in the SAME Perl script or module. This would make my code a lot more fluid. Without it, I'd wind up writing a Perl script that calls a bash script halfway through, which in turn would need to call another Perl script to prep the data. If at all possible, I'd love to just stay in Perl the whole way through. Any ideas?
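To be concrete, here's the shape of what I'm hoping is possible, using core IPC::Open2 so I can both feed the data in and read the processed output back in one script. The input handling is a guess (I'm assuming someapp can be made to read STDIN somehow), and prep_data() is just a placeholder for my existing prep code:

    use strict;
    use warnings;
    use IPC::Open2;

    my @records = prep_data();    # placeholder for my existing prep logic

    # Start someapp with handles on both its STDIN and STDOUT.
    # ASSUMPTION: someapp reads records from STDIN when --input is omitted.
    my $pid = open2(my $child_out, my $child_in,
                    'bin/someapp', 'do-action', '--option1', 'some_value');

    print $child_in "$_\n" for @records;
    close($child_in);             # send EOF so someapp knows input is done

    while (my $line = <$child_out>) {
        print $line;              # or process each output line here
    }
    close($child_out);
    waitpid($pid, 0);             # reap the child process

One thing I'm wary of with open2 is deadlock: if someapp produces a lot of output before I've finished writing input, both pipes can fill up. If that's a real risk with my data volumes, maybe IPC::Run's run() would be the safer route?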