
I am using a script to call URLs, get the output, and save the output to a text file. It all works great. However, when I need to call multiple URLs in succession it causes CPU spikes because it is starting/stopping powershell.exe so many times. Is there a way I could use a for-each technique for each URL while still saving the output? Here is my script:

$content = Get-Content $PSScriptRoot\urls.txt

echo "Testing for $content"

(Invoke-WebRequest -Uri "$content").Content |
    Out-File -FilePath "$PSScriptRoot\out.txt"

$status = Get-Content $PSScriptRoot\out.txt

Note that the echo statements are just for debugging purposes; they don't really matter.

  • You mean something like Get-Content urls.txt | ForEach-Object { Invoke-WebRequest -Uri $_ | Select-Object -Expand Content }? Commented Jul 5, 2017 at 11:04
  • yup! That is what I needed. Commented Jul 5, 2017 at 16:57

1 Answer


Is this something like what you wanted to do?

$contents = Get-Content $PSScriptRoot\urls.txt
foreach ($content in $contents) {
    (Invoke-WebRequest -Uri "$content").Content |
        Out-File -FilePath "$PSScriptRoot\out.txt" -Append
}

I wrote a urls.txt with 7-8 URLs, and it worked fine for me.
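As the comment under the question suggests, the same thing can also be written as a single pipeline, which keeps everything in one PowerShell process. A minimal sketch, reusing the question's file names; the try/catch error handling is my addition, not part of the original answer:

```powershell
# Read each URL from urls.txt and fetch it in one PowerShell process,
# appending every response body to out.txt.
Get-Content "$PSScriptRoot\urls.txt" | ForEach-Object {
    $url = $_   # capture the URL, since $_ means the error record inside catch
    try {
        (Invoke-WebRequest -Uri $url).Content |
            Out-File -FilePath "$PSScriptRoot\out.txt" -Append
    }
    catch {
        # A bad URL or network failure shouldn't stop the remaining requests.
        Write-Warning "Request to $url failed: $($_.Exception.Message)"
    }
}
```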


3 Comments

Aha, I was missing the { } and the -Append. Perfect, my friend!
Can't upvote, Chetan, not enough rep! Sorry.
Cool, no problem! Get back when you have rep. Cheers!
