I have a simple PowerShell script that walks a directory tree and lists the files in JSON format.
Each entry is of the form:
{id: filename, size: bytes }
It works fine for short listings, but it is very slow for large directories. I also want to write the output to a file (manifest.json).
I am much more comfortable in C#/.NET (there I would use Directory.EnumerateFiles()), but I thought I would see if I can't get simple things done more easily in PowerShell.
However, this script really bogs down once I get to about 10K entries.
$src = "G:\wwwroot\BaseMaps\BigBlueMarble"
$path = $src + "\*"
$excludes = @("*.json", "*.ps1")
$version = "1.1"

# Emit the JSON envelope by hand
Write-Host "{"
Write-Host "`"manifest-version`": `"$version`","
Write-Host "`"files`": ["

# Enumerate every file under the top-level items, skipping the excluded patterns
$dirs = Get-Item -Path $path -Exclude $excludes
$dirs | Get-ChildItem -Recurse -File | ForEach-Object {
    $fpath = $_.FullName.Replace($src, "").Replace("\", "/")
    $date  = $_.LastWriteTime
    $size  = $_.Length
    $id    = $_.BaseName
    Write-Host "{`"id`": `"$id`", `"size`": `"$size`"},"
}

Write-Host "]"
Write-Host "}"
Get-ChildItem is slow. Better stick with C#/.NET for this. See this answer to a similar question.

Have you tried ConvertTo-Json, which may be faster than string concatenation?
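A minimal sketch of the ConvertTo-Json approach suggested above, assuming the same $src and $excludes as in the question. The main changes: build the entries as objects, serialize the whole manifest once, and write the result to manifest.json in a single Out-File call instead of emitting one Write-Host line per file. Note one behavioral difference: here -Exclude is applied during the recursive enumeration, whereas the original only excluded top-level items.

```powershell
$src = "G:\wwwroot\BaseMaps\BigBlueMarble"
$excludes = @("*.json", "*.ps1")

# Collect one object per file; ConvertTo-Json turns these into the
# {"id": ..., "size": ...} entries in one pass.
$files = Get-ChildItem -Path $src -Recurse -File -Exclude $excludes |
    ForEach-Object {
        [pscustomobject]@{ id = $_.BaseName; size = $_.Length }
    }

# Wrap the entries in the manifest envelope, then serialize and write once.
$manifest = [pscustomobject]@{
    'manifest-version' = '1.1'
    files              = $files
}

$manifest |
    ConvertTo-Json -Depth 3 |
    Out-File (Join-Path $src 'manifest.json') -Encoding utf8
```

This also sidesteps the per-line console overhead of Write-Host, which tends to dominate once the listing reaches thousands of entries.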