I am trying to restore objects from Glacier using aws-cli as follows:
aws s3api restore-object --restore-request Days=7 --bucket mybucket --key some-file.ext
This works fine. However, I want to do this for a bunch of files in a script, but if I try something like:
$ export I="some-file.ext"
$ aws s3api restore-object --restore-request Days=7 --bucket mybucket --key $I
aws-cli returns this error:
An error occurred (404) when calling the RestoreObject operation: Not Found
No matter what $I contains, if I manually take its value and pass it to aws-cli, it works:
$ echo $I
some-other-file.ext
$ aws s3api restore-object --restore-request Days=7 --bucket mybucket --key some-other-file.ext
If I repeat the operation over the same key it returns a confirmation:
$ aws s3api restore-object --restore-request Days=7 --bucket mybucket --key some-other-file.ext
An error occurred (RestoreAlreadyInProgress) when calling the RestoreObject operation: Object restore is already in progress
My version of aws-cli is as follows:
$ aws --version
aws-cli/1.10.50 Python/2.7.6 Linux/3.13.0-92-generic botocore/1.4.40
How can I pass the --key parameter using a bash variable? Other suggestions? xargs also fails, though with a different error:
$ {echo list of files one per line} | xargs -L 1 aws s3api restore-object --restore-request Days=7 --bucket mybucket --key
An error occurred (404) when calling the RestoreObject operation: Not Found
xargs: aws: exited with status 255; aborting
or, even without -L 1, xargs prints the list of files separated by , and:
xargs: aws: exited with status 255; aborting
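To make the xargs attempt concrete, this is roughly the form I am trying (a sketch only: printf stands in for the real command that lists keys one per line, and echo is prepended to aws as a dry run so the constructed command line can be inspected without calling the API):

```shell
# Dry run of the xargs form: each input line becomes one --key argument.
# printf simulates the real file-listing command; echo prints the command
# instead of executing aws, so no API call is made.
printf '%s\n' "some-file.ext" "some-other-file.ext" \
  | xargs -L 1 echo aws s3api restore-object --restore-request Days=7 --bucket mybucket --key
```

Dropping the leading echo should produce the real invocations, but as described above that is exactly where I get the 404.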
Even trying this answer, it fails with the same 404 as above.
Any idea how I can pass file paths to aws-cli in a script-friendly fashion?
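For reference, the loop I am ultimately trying to get working looks roughly like this (files.txt and mybucket are placeholders, and echo stands in for aws as a dry run so the command lines being built can be checked first):

```shell
# Build a sample key list, one key per line (placeholder for my real list).
printf '%s\n' "some-file.ext" "some-other-file.ext" > files.txt

# Read each line into $key and build the restore command for it.
# echo is a dry-run stand-in for aws; remove it to issue the real calls.
while IFS= read -r key; do
  echo aws s3api restore-object --restore-request Days=7 --bucket mybucket --key "$key"
done < files.txt
```

The quoting of "$key" is deliberate, since the failures above suggest the variable's contents are not reaching aws-cli intact.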