I have a PowerShell script:
(Select-String -Path c:\*.txt -Pattern 'searchstring').Line | Set-Content d:\output.txt
It works fine, but the output file is too big. Instead of a single output.txt, I would like it to write multiple files, one output file for each input file matched by the wildcard.
So:
c:\1.txt
c:\2.txt
c:\3.txt
Would output the matched lines to:
d:\1.txt
d:\2.txt
d:\3.txt
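One way to do this is to run Select-String once per input file and derive each output name from the input name. A minimal sketch, assuming the inputs sit directly under c:\ and you want identically named outputs under d:\ (the pattern 'searchstring' is carried over from the question):

```powershell
# Run Select-String separately for each input file so that
# each file's matching lines go to an output file of the
# same name under d:\.
Get-ChildItem -Path 'c:\*.txt' | ForEach-Object {
    (Select-String -Path $_.FullName -Pattern 'searchstring').Line |
        Set-Content -Path (Join-Path 'd:\' $_.Name)
}
```

Because each file is processed independently, only one file's matches are held in memory at a time.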
What about the `.Filename` property of the objects returned by `Select-String`? You can group on that and then output to a new file based on the items in each group.

You mean `.Path`. I had considered a `Group-Object` solution, but it is more memory-intensive (and possibly also slower overall) than calling `Select-String` on each input file, and the solution isn't any shorter.

And `-Path`. [grin] However, I suspect you are correct about the likely RAM needs. I have noticed that piping from `Get-ChildItem` to `Select-String` is notably slower than using `Select-String` to grab the files directly, so it may not be slower to use `Group-Object` in that situation.

Be careful with `Filename` for grouping: it may work in this specific case, but you're asking for trouble in other cases (files with the same name in different directories would be grouped together; `Path` is unambiguous). The speed tradeoff doesn't come from what you describe - there's virtually no difference between piping file-info objects vs. using `-Path` with a single `Select-String` call - but comes from requiring multiple `Select-String` calls. That is balanced against the processing (not just memory) overhead that grouping brings.
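For comparison, the `Group-Object` approach discussed in these comments might look like the sketch below. It makes a single `Select-String` call but buffers every match in memory before grouping and writing (again reusing the question's 'searchstring' pattern and d:\ target):

```powershell
# One Select-String pass over all inputs; group the resulting
# MatchInfo objects by source file, then write each group's
# lines to an output file of the same name under d:\.
Select-String -Path 'c:\*.txt' -Pattern 'searchstring' |
    Group-Object -Property Filename |
    ForEach-Object {
        # $_.Name is the group key, i.e. the source file's name.
        $_.Group.Line | Set-Content -Path (Join-Path 'd:\' $_.Name)
    }
```

As noted above, grouping on `Filename` is safe only while all inputs come from a single directory; with nested input paths, grouping on `Path` and deriving the output name separately avoids collisions.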