I wrote a script to delete temporary files and folders older than 90 days on remote servers. The server list is loaded from server.txt with Get-Content, and I use `net use` to connect to each server's IPC$ share. I'm worried that I'm not following best practices for deleting the old temp files. Here is the meat of my script:
net use \\$server\IPC$ /user:$Authname $pw /persistent:no
Get-ChildItem -Path "\\$server\C$\Temp" -Recurse | Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $cutoffdate } | Remove-Item
Get-ChildItem -Path "\\$server\C$\Temp" -Recurse | Where-Object { $_.PSIsContainer -and $_.GetFileSystemInfos().Count -eq 0 } | Remove-Item
net use \\$server\IPC$ /delete
The first Get-ChildItem pipeline deletes the old files; the second removes folders that are left empty.
The reason I'm concerned is that in my initial tests it takes about half an hour to delete roughly 4 GB from a single server. And I work in a big shop: the script needs to run against about 10,000 servers. At that rate a sequential run would take roughly seven months, and I was hoping to run it quarterly.
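To make the scale concrete, here's the back-of-the-envelope arithmetic behind that estimate (the 30-minute figure comes from my initial tests; everything else is just unit conversion):

```powershell
# Rough sequential-runtime estimate; 30 min/server is from my initial tests
$minutesPerServer = 30
$serverCount      = 10000
$totalDays = ($minutesPerServer * $serverCount) / 60 / 24   # minutes -> hours -> days
$months    = $totalDays / 30.4                              # approximate month length
"{0:N0} days, about {1:N1} months" -f $totalDays, $months
```

That works out to roughly 208 days of wall-clock time if the servers are processed one at a time.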
Am I doing something the hard way?